[ 466.541961] env[60738]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 467.182254] env[60788]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 468.528557] env[60788]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=60788) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 468.530968] env[60788]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=60788) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 468.530968] env[60788]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=60788) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 468.530968] env[60788]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 468.733787] env[60788]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60788) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 468.743900] env[60788]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=60788) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 468.849502] env[60788]: INFO nova.virt.driver [None req-0ed7968a-26b6-4e9a-a2df-8d8b7e51d0b9 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 468.920754] env[60788]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 468.920909] env[60788]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 468.921023] env[60788]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60788) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 471.815014] env[60788]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-2d798013-b53d-40f9-9e55-20332a9da1f9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.832589] env[60788]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60788) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 471.832739] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-98ead2f6-861c-4468-8aaa-af6b0494ea8c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.872248] env[60788]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 10527.
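Annotation: the records above show nova-compute serializing vCenter login behind "oslo_vmware_api_lock" and establishing a session through oslo.vmware. Below is a minimal sketch of the same session setup, not taken from the log: only the host comes from the soap_url above, while the credentials, retry count, and poll interval are placeholders, and constructor details may vary across oslo.vmware releases.

```python
# Sketch of the session creation logged above (assumptions marked inline).
from oslo_vmware import api

session = api.VMwareAPISession(
    'vc1.osci.c.eu-de-1.cloud.sap',  # vCenter host from the soap_url in the log
    'administrator@vsphere.local',   # placeholder username (not in the log)
    'secret',                        # placeholder password (not in the log)
    api_retry_count=10,              # assumed tuning values
    task_poll_interval=0.5,
)
# On success oslo.vmware emits "Successfully established new session",
# matching the INFO record above; service content is then available:
print(session.vim.service_content.about.fullName)
```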
[ 471.872458] env[60788]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.951s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 471.873138] env[60788]: INFO nova.virt.vmwareapi.driver [None req-0ed7968a-26b6-4e9a-a2df-8d8b7e51d0b9 None None] VMware vCenter version: 7.0.3
[ 471.877267] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5583e56f-b3e0-40ce-a1c2-ec9772f6bace {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.894828] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09deb2d8-ba12-4b44-b3e4-82e36767f3a0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.900714] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cf3138b-8438-4ad7-be82-e1040188b74e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.907328] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb6c7761-605d-49f3-89ce-22a306c52be5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.920378] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed175cd8-f67d-4ed9-b6e9-d04c11f6327b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.926188] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e769b52-1845-4fc8-8693-17f66ebcdea7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.957792] env[60788]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-53b9a38c-3ab0-4fcd-a92f-e26024b84978 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 471.963128] env[60788]: DEBUG nova.virt.vmwareapi.driver [None req-0ed7968a-26b6-4e9a-a2df-8d8b7e51d0b9 None None] Extension org.openstack.compute already exists. {{(pid=60788) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 471.965727] env[60788]: INFO nova.compute.provider_config [None req-0ed7968a-26b6-4e9a-a2df-8d8b7e51d0b9 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
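Annotation: the "Acquiring lock ... / Lock ... acquired ... waited / Lock ... released ... held" triples above come from oslo.concurrency's synchronized decorator (the "inner" wrapper cited at lockutils.py:404-423). A minimal sketch of the pattern, with a hypothetical function name:

```python
# Sketch of the lockutils pattern behind the acquire/release records above.
from oslo_concurrency import lockutils

@lockutils.synchronized('oslo_vmware_api_lock')
def _create_session():
    # Body elided. While one thread holds the lock, other callers block
    # here; the decorator DEBUG-logs the waited/held durations seen above.
    pass
```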
[ 471.982969] env[60788]: DEBUG nova.context [None req-0ed7968a-26b6-4e9a-a2df-8d8b7e51d0b9 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),2acd5d50-2dae-4f29-9279-437956b976b2(cell1) {{(pid=60788) load_cells /opt/stack/nova/nova/context.py:464}}
[ 471.984881] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 471.985125] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 471.985801] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 471.986218] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Acquiring lock "2acd5d50-2dae-4f29-9279-437956b976b2" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 471.986412] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Lock "2acd5d50-2dae-4f29-9279-437956b976b2" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 471.987363] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Lock "2acd5d50-2dae-4f29-9279-437956b976b2" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 472.012387] env[60788]: INFO dbcounter [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Registered counter for database nova_cell0
[ 472.020922] env[60788]: INFO dbcounter [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Registered counter for database nova_cell1
[ 472.023939] env[60788]: DEBUG oslo_db.sqlalchemy.engines [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60788) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 472.024308] env[60788]: DEBUG oslo_db.sqlalchemy.engines [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60788) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 472.028884] env[60788]: DEBUG dbcounter [-] [60788] Writer thread running {{(pid=60788) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 472.029645] env[60788]: DEBUG dbcounter [-] [60788] Writer thread running {{(pid=60788) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 472.031789] env[60788]: ERROR nova.db.main.api [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 472.031789] env[60788]: result = function(*args, **kwargs)
[ 472.031789] env[60788]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 472.031789] env[60788]: return func(*args, **kwargs)
[ 472.031789] env[60788]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 472.031789] env[60788]: result = fn(*args, **kwargs)
[ 472.031789] env[60788]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 472.031789] env[60788]: return f(*args, **kwargs)
[ 472.031789] env[60788]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 472.031789] env[60788]: return db.service_get_minimum_version(context, binaries)
[ 472.031789] env[60788]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 472.031789] env[60788]: _check_db_access()
[ 472.031789] env[60788]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 472.031789] env[60788]: stacktrace = ''.join(traceback.format_stack())
[ 472.031789] env[60788]:
[ 472.032816] env[60788]: ERROR nova.db.main.api [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 472.032816] env[60788]: result = function(*args, **kwargs)
[ 472.032816] env[60788]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 472.032816] env[60788]: return func(*args, **kwargs)
[ 472.032816] env[60788]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 472.032816] env[60788]: result = fn(*args, **kwargs)
[ 472.032816] env[60788]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 472.032816] env[60788]: return f(*args, **kwargs)
[ 472.032816] env[60788]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 472.032816] env[60788]: return db.service_get_minimum_version(context, binaries)
[ 472.032816] env[60788]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 472.032816] env[60788]: _check_db_access()
[ 472.032816] env[60788]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 472.032816] env[60788]: stacktrace = ''.join(traceback.format_stack())
[ 472.032816] env[60788]:
[ 472.033226] env[60788]: WARNING nova.objects.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Failed to get minimum service version for cell 2acd5d50-2dae-4f29-9279-437956b976b2
[ 472.033345] env[60788]: WARNING nova.objects.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 472.033777] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Acquiring lock "singleton_lock" {{(pid=60788) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 472.033944] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Acquired lock "singleton_lock" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 472.034237] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Releasing lock "singleton_lock" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 472.034567] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Full set of CONF: {{(pid=60788) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}} [ 472.034712] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ******************************************************************************** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}} [ 472.034842] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] Configuration options gathered from: {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}} [ 472.034978] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}} [ 472.035197] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}} [ 472.035329] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ================================================================================ {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}} [ 472.035537] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] allow_resize_to_same_host = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.035710] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] arq_binding_timeout = 300 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.035841] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] backdoor_port = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.035968] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] backdoor_socket = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.036149] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] block_device_allocate_retries = 60 {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.036316] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] block_device_allocate_retries_interval = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.036489] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cert = self.pem {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.036652] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.036818] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute_monitors = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.036990] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] config_dir = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.037211] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] config_drive_format = iso9660 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.037353] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.037524] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] config_source = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.037696] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] console_host = devstack {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.037890] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] control_exchange = nova {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.038084] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cpu_allocation_ratio = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.038255] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] daemon = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.038426] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] debug = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.038588] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] default_access_ip_network_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.038758] 
env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] default_availability_zone = nova {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.038945] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] default_ephemeral_format = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.039169] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] default_green_pool_size = 1000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.039450] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.039710] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] default_schedule_zone = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.039902] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] disk_allocation_ratio = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.040085] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] enable_new_services = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.040338] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] enabled_apis = ['osapi_compute'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.040629] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] enabled_ssl_apis = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.040831] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] flat_injected = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.041009] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] force_config_drive = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.041184] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] force_raw_images = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.041359] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 
None None] graceful_shutdown_timeout = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.041525] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] heal_instance_info_cache_interval = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.041746] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] host = cpu-1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.041950] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.042157] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.042328] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.042544] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.042708] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_build_timeout = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.042873] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_delete_interval = 300 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.043056] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_format = [instance: %(uuid)s] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.043234] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_name_template = instance-%08x {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.043400] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_usage_audit = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.043600] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_usage_audit_period = month {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.043780] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.043953] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.044142] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] internal_service_availability_zone = internal {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.044305] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] key = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.044467] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] live_migration_retry_count = 30 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.044634] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_config_append = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.044805] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.045009] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_dir = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.045237] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.045376] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_options = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.045543] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_rotate_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.045716] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_rotate_interval_type = days {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.045888] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] log_rotation_type = none {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.046033] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.046225] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.046429] env[60788]: DEBUG 
oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.046606] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.046740] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.046907] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] long_rpc_timeout = 1800 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.047090] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] max_concurrent_builds = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.047258] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] max_concurrent_live_migrations = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.047420] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] max_concurrent_snapshots = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.047581] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] max_local_block_devices = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.047742] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] max_logfile_count = 30 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.047929] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] max_logfile_size_mb = 200 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.048111] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] maximum_instance_delete_attempts = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.048286] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metadata_listen = 0.0.0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.048456] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metadata_listen_port = 8775 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.048624] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metadata_workers = 2 {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.048786] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] migrate_max_retries = -1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.048980] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] mkisofs_cmd = genisoimage {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.049213] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.049349] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] my_ip = 10.180.1.21 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.049515] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] network_allocate_retries = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.049696] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.049864] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.050041] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] osapi_compute_listen_port = 8774 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.050219] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] osapi_compute_unique_server_name_scope = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.050387] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] osapi_compute_workers = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.050552] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] password_length = 12 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.050715] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] periodic_enable = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.050877] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] periodic_fuzzy_delay = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.051062] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] pointer_model = usbtablet {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.051240] env[60788]: 
DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] preallocate_images = none {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.051402] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] publish_errors = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.051534] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] pybasedir = /opt/stack/nova {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.051692] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ram_allocation_ratio = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.051858] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rate_limit_burst = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.052066] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rate_limit_except_level = CRITICAL {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.052240] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rate_limit_interval = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.052404] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reboot_timeout = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.052568] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reclaim_instance_interval = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.052729] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] record = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.052900] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reimage_timeout_per_gb = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.053083] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] report_interval = 120 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.053250] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rescue_timeout = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.053413] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reserved_host_cpus = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.053573] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reserved_host_disk_mb = 0 {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.053733] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reserved_host_memory_mb = 512 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.053893] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] reserved_huge_pages = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.054070] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] resize_confirm_window = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.054236] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] resize_fs_using_block_device = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.054397] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] resume_guests_state_on_host_boot = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.054567] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.054730] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rpc_response_timeout = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.054898] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] run_external_periodic_tasks = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.055106] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] running_deleted_instance_action = reap {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.055276] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.055439] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] running_deleted_instance_timeout = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.055600] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler_instance_sync_interval = 120 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.055770] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_down_time = 720 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.055944] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] servicegroup_driver = db {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.056126] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] shelved_offload_time = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.056293] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] shelved_poll_interval = 3600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.056462] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] shutdown_timeout = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.056628] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] source_is_ipv6 = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.056788] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ssl_only = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.057051] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.057230] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] sync_power_state_interval = 600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.057395] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] sync_power_state_pool_size = 1000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.057567] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] syslog_log_facility = LOG_USER {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.057726] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] tempdir = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.057918] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] timeout_nbd = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.058113] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] transport_url = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.058283] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] update_resources_interval = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.058446] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] use_cow_images = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.058609] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 
None None] use_eventlog = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.058771] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] use_journal = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.058962] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] use_json = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.059132] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] use_rootwrap_daemon = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.059271] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] use_stderr = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.059429] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] use_syslog = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.059587] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vcpu_pin_set = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.059754] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plugging_is_fatal = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.059920] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plugging_timeout = 300 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.060098] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] virt_mkfs = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.060265] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] volume_usage_poll_interval = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.060424] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] watch_log_file = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.060592] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] web = /usr/share/spice-html5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 472.060779] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_concurrency.disable_process_locking = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.061087] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.061277] 
env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.061446] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.061620] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.061793] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.061963] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.062167] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.auth_strategy = keystone {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.062341] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.compute_link_prefix = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.062519] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.062698] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.dhcp_domain = novalocal {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.062870] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.enable_instance_password = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.063075] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.glance_link_prefix = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.063243] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.063418] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.063582] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
api.instance_list_per_project_cells = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.063749] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.list_records_by_skipping_down_cells = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.063917] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.local_metadata_per_cell = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.064104] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.max_limit = 1000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.064285] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.metadata_cache_expiration = 15 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.064458] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.neutron_default_tenant_id = default {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.064627] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.use_forwarded_for = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.064797] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.use_neutron_default_nets = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.064979] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.065190] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.065338] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.065514] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.065691] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.vendordata_dynamic_targets = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.065884] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api.vendordata_jsonfile_path = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.066091] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
api.vendordata_providers = ['StaticJSON'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.066291] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.backend = dogpile.cache.memcached {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.066462] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.backend_argument = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.066636] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.config_prefix = cache.oslo {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.066806] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.dead_timeout = 60.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.066973] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.debug_cache_backend = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.067154] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.enable_retry_client = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.067318] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.enable_socket_keepalive = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.067490] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.enabled = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.067656] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.expiration_time = 600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.067817] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.hashclient_retry_attempts = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.068044] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.068232] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_dead_retry = 300 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.068407] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_password = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.068575] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.068741] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.068906] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_pool_maxsize = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.069086] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.069255] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_sasl_enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.069436] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.069605] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.069774] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.memcache_username = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.069943] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.proxies = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.070133] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.retry_attempts = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.070298] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.retry_delay = 0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.070464] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.socket_keepalive_count = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.070625] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.socket_keepalive_idle = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.070788] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.socket_keepalive_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.070957] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.tls_allowed_ciphers = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.071139] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.tls_cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.071303] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.tls_certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.071466] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.tls_enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.071629] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cache.tls_keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.071801] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.071981] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.auth_type = password {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.072165] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.072345] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.072509] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.072678] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.072843] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.cross_az_attach = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.073015] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.debug = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.073193] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.endpoint_template = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.073359] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.http_retries = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.073523] env[60788]: DEBUG oslo_service.service [None 
req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.073685] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.073858] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.os_region_name = RegionOne {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.074039] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.074208] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cinder.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.074382] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.074547] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.cpu_dedicated_set = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.074708] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.cpu_shared_set = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.074877] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.image_type_exclude_list = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.075058] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.075232] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.075397] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.075563] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.075743] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.075951] env[60788]: DEBUG oslo_service.service 
[None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.resource_provider_association_refresh = 300 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.076149] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.shutdown_retry_interval = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.076340] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.076523] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] conductor.workers = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.076704] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] console.allowed_origins = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.076870] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] console.ssl_ciphers = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.077059] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] console.ssl_minimum_version = default {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.077240] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] consoleauth.token_ttl = 600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.077415] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.077579] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.077809] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.077920] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.078111] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.078282] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.078497] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
cyborg.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.078703] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.078937] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.079237] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.079237] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.region_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.079392] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.service_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.079569] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.service_type = accelerator {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.080173] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.080173] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.080173] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081594] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081594] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081594] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] cyborg.version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081594] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.backend = sqlalchemy {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081594] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.connection = **** {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081594] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.connection_debug = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081782] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.connection_parameters = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081782] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.connection_recycle_time = 3600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.081782] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.connection_trace = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082033] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.db_inc_retry_interval = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082109] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.db_max_retries = 20 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082284] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.db_max_retry_interval = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082465] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.db_retry_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082653] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.max_overflow = 50 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082832] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.max_pool_size = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.082998] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.max_retries = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.083190] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.083356] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.mysql_wsrep_sync_wait = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.083524] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.pool_timeout = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.083696] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.retry_interval = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.083859] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.slave_connection = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.084040] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.sqlite_synchronous = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.084212] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] database.use_db_reconnect = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.084398] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.backend = sqlalchemy {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.084578] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.connection = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.084750] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.connection_debug = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.084922] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.connection_parameters = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.085102] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.connection_recycle_time = 3600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.085272] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.connection_trace = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.085436] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.db_inc_retry_interval = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.085601] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.db_max_retries = 20 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.085764] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.db_max_retry_interval = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.085948] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.db_retry_interval = 1 {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.086150] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.max_overflow = 50 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.086318] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.max_pool_size = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.086490] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.max_retries = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.086667] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.086830] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.086993] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.pool_timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.087183] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.retry_interval = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.087348] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.slave_connection = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.087516] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] api_database.sqlite_synchronous = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.087693] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] devices.enabled_mdev_types = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.087890] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.088081] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ephemeral_storage_encryption.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.088259] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.088433] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.api_servers = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.088600] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.088763] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.088928] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.089137] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.089308] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.089473] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.debug = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.089642] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.default_trusted_certificate_ids = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.089806] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.enable_certificate_validation = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.089971] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.enable_rbd_download = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.090153] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.090321] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.090485] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.090647] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.090807] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.090973] env[60788]: DEBUG 
oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.num_retries = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.091164] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.rbd_ceph_conf = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.091331] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.rbd_connect_timeout = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.091503] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.rbd_pool = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.091674] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.rbd_user = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.091839] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.region_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.092015] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.service_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.092236] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.service_type = image {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.092410] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.092574] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.092738] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.092900] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.093101] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.093274] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.verify_glance_signatures = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.093437] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] glance.version = None {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.093606] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] guestfs.debug = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.093781] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.config_drive_cdrom = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.093949] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.config_drive_inject_password = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.094133] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.094303] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.094466] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.enable_remotefx = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.094639] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.instances_path_share = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.094807] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.iscsi_initiator_list = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.094971] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.limit_cpu_features = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.095153] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.095319] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.095483] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.095652] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.095859] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.096066] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.use_multipath_io = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.096243] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.096410] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.096574] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.vswitch_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.096737] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.096904] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] mks.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.097282] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.097479] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] image_cache.manager_interval = 2400 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.097654] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] image_cache.precache_concurrency = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.097826] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] image_cache.remove_unused_base_images = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.098035] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.098219] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.098399] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] image_cache.subdirectory_name = _base {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.098575] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.api_max_retries 
= 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.098824] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.api_retry_interval = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.098896] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.099071] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.auth_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.099241] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.099402] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.099567] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.099739] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.conductor_group = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.099904] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.100077] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.100242] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.100407] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.100567] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.100727] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.100888] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.101066] env[60788]: DEBUG 
oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.peer_list = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.101231] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.region_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.101397] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.serial_console_state_timeout = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.101556] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.service_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.101729] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.service_type = baremetal {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.101892] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.102062] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.102224] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.102384] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.102564] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.102730] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ironic.version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.102909] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.103097] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] key_manager.fixed_key = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.103286] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.103451] env[60788]: DEBUG oslo_service.service [None 
req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.barbican_api_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.103615] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.barbican_endpoint = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.103786] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.barbican_endpoint_type = public {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.103944] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.barbican_region_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.104148] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.104330] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.104497] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.104660] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.104816] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.104979] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.number_of_retries = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.105156] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.retry_delay = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.105320] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.send_service_user_token = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.105485] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.105643] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.105805] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.verify_ssl = True {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.105965] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican.verify_ssl_path = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.106153] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.106317] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.auth_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.106477] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.106635] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.106797] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.106956] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.107128] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.107291] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.107448] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] barbican_service_user.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.107612] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.approle_role_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.107772] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.approle_secret_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.107954] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.108140] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.certfile = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.108306] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.108467] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.108624] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.108793] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.kv_mountpoint = secret {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.108992] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.kv_path = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.109188] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.kv_version = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.109356] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.namespace = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.109519] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.root_token_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.109684] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.109845] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.ssl_ca_crt_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.110019] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.110184] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.use_ssl = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.110356] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.110528] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.110694] env[60788]: DEBUG oslo_service.service [None 
req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.auth_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.110855] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.111068] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.111324] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.111511] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.111680] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.111848] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.112026] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.112198] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.112364] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.112526] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.112689] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.region_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.112853] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.service_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.113039] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.service_type = identity {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.113210] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.split_loggers = False {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.113372] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.113534] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.113693] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.113877] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.114056] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] keystone.version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.114268] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.connection_uri = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.114434] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_mode = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.114606] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.114775] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_models = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.114952] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_power_governor_high = performance {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.115141] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.115307] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_power_management = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.115482] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.115653] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.device_detach_attempts = 8 {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.115819] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.device_detach_timeout = 20 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.116043] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.disk_cachemodes = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.116226] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.disk_prefix = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.116398] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.enabled_perf_events = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.116566] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.file_backed_memory = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.116732] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.gid_maps = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.116895] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.hw_disk_discard = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.117069] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.hw_machine_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.117245] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_rbd_ceph_conf = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.117411] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.117579] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.117747] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_rbd_glance_store_name = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.117945] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_rbd_pool = rbd {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.118149] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_type = default {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.118319] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.images_volume_group = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.118485] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.inject_key = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.118652] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.inject_partition = -2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.118815] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.inject_password = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.119011] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.iscsi_iface = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.119196] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.iser_use_multipath = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.119367] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.119534] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.119701] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_downtime = 500 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.119865] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.120046] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.120217] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_inbound_addr = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.120385] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.120547] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.120708] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_scheme = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.120880] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_timeout_action = abort {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.121055] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_tunnelled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.121223] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_uri = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.121388] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.live_migration_with_native_tls = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.121544] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.max_queues = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.121705] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.121865] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.nfs_mount_options = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.122206] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.122457] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.122705] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.122892] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.123078] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.123246] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.num_pcie_ports = 0 
{{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.123415] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.123582] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.pmem_namespaces = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.123743] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.quobyte_client_cfg = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.124047] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.124225] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.124394] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.124558] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.124719] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rbd_secret_uuid = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.124879] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rbd_user = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.125088] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.125280] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.125443] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rescue_image_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.125603] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rescue_kernel_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.125763] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rescue_ramdisk_id = None {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.125954] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.126152] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.rx_queue_size = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.126323] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.smbfs_mount_options = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.126601] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.126777] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.snapshot_compression = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.126939] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.snapshot_image_format = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.127175] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.127342] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.sparse_logical_volumes = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.127504] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.swtpm_enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.127672] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.swtpm_group = tss {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.127869] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.swtpm_user = tss {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.128084] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.sysinfo_serial = unique {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.128263] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.tb_cache_size = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.128481] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.tx_queue_size = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.128593] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.uid_maps = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.128758] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.use_virtio_for_bridges = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.128932] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.virt_type = kvm {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.129119] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.volume_clear = zero {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.129287] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.volume_clear_size = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.129453] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.volume_use_multipath = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.129614] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_cache_path = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.129782] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.129949] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.130128] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.130299] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.130573] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.130753] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.vzstorage_mount_user = stack {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.130920] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.131110] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.131289] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.auth_type = password {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.131451] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.131611] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.131773] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.131929] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.132101] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.132275] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.default_floating_pool = public {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.132435] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.132597] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.extension_sync_interval = 600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.132757] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.http_retries = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.132917] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.133087] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.133254] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.133425] env[60788]: 
DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.133583] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.133751] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.ovs_bridge = br-int {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.133914] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.physnets = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.134098] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.region_name = RegionOne {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.134273] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.service_metadata_proxy = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.134437] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.service_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.134606] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.service_type = network {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.134768] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.134926] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.135100] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.135262] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.135440] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.135604] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] neutron.version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.135776] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None 
None] notifications.bdms_in_notifications = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.135975] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] notifications.default_level = INFO {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.136178] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] notifications.notification_format = unversioned {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.136348] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] notifications.notify_on_state_change = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.136525] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.136700] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] pci.alias = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.136869] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] pci.device_spec = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.137048] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] pci.report_in_placement = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.137225] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.137398] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.auth_type = password {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.137565] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.137727] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.137904] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.138089] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.138252] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
placement.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.138412] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.138572] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.default_domain_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.138730] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.default_domain_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.138891] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.domain_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.139060] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.domain_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.139225] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.139387] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.139547] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.139708] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.139868] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.140085] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.password = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.140267] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.project_domain_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.140440] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.project_domain_name = Default {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.140610] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.project_id = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.140784] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.project_name = service {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.140957] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.region_name = RegionOne {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.141137] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.service_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.141309] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.service_type = placement {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.141473] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.141630] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.141788] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.141949] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.system_scope = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.142126] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.142293] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.trust_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.142455] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.user_domain_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.142626] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.user_domain_name = Default {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.142787] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.user_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.142961] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.username = placement {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
472.143162] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.143332] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] placement.version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.143512] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.cores = 20 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.143677] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.count_usage_from_placement = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.143850] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.144032] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.injected_file_content_bytes = 10240 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.144206] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.injected_file_path_length = 255 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.144373] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.injected_files = 5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.144542] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.instances = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.144710] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.key_pairs = 100 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.144877] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.metadata_items = 128 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.145054] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.ram = 51200 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.145222] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.recheck_quota = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.145388] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] quota.server_group_members = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.145555] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None 
None] quota.server_groups = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.145723] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rdp.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.146092] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.146294] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.146469] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.146638] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.image_metadata_prefilter = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.146803] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.146973] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.max_attempts = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.147157] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.max_placement_results = 1000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.147328] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.147497] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.147662] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.147841] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] scheduler.workers = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.148054] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
472.148241] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.148423] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.148598] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.148768] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.148970] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.149195] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.149395] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.149570] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.host_subset_size = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.149739] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.149906] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.150090] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.150266] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.isolated_hosts = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.150432] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.isolated_images = [] 
{{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.150598] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.150762] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.150929] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.151111] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.pci_in_placement = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.151279] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.151444] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.151612] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.151833] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.151936] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.152154] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.152338] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.track_instance_changes = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.152520] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.152697] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metrics.required = True {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.152868] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metrics.weight_multiplier = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.153048] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.153224] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] metrics.weight_setting = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.153529] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.153708] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] serial_console.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.153891] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] serial_console.port_range = 10000:20000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.154829] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.154829] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.154829] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] serial_console.serialproxy_port = 6083 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.154829] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.154829] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.auth_type = password {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.154984] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.155131] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.155299] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.collect_timing = False {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.155462] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.155622] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.155810] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.send_service_user_token = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.155998] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.156182] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] service_user.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.156357] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.agent_enabled = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.156523] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.156822] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.157029] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.157210] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.html5proxy_port = 6082 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.157377] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.image_compression = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.157540] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.jpeg_compression = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.157702] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.playback_compression = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.157906] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.server_listen = 127.0.0.1 {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.158093] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.158263] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.streaming_mode = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.158424] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] spice.zlib_compression = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.158597] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] upgrade_levels.baseapi = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.158759] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] upgrade_levels.cert = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.158953] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] upgrade_levels.compute = auto {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.159144] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] upgrade_levels.conductor = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.159310] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] upgrade_levels.scheduler = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.159481] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.159648] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.159810] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.159972] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.160155] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.160321] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.insecure = False {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.160481] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.160645] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.160805] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vendordata_dynamic_auth.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.160981] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.api_retry_count = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.161162] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.ca_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.161337] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.161507] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.cluster_name = testcl1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.161675] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.connection_pool_size = 10 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.161837] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.console_delay_seconds = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.162023] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.datastore_regex = ^datastore.* {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.162237] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.162413] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.host_password = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.162579] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.host_port = 443 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.162748] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.host_username = administrator@vsphere.local {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.162917] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.insecure = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.163097] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.integration_bridge = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.163267] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.maximum_objects = 100 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.163432] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.pbm_default_policy = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.163596] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.pbm_enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.163754] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.pbm_wsdl_location = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.163926] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.164111] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.serial_port_proxy_uri = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.164278] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.serial_port_service_uri = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.164448] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.task_poll_interval = 0.5 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.164620] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.use_linked_clone = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.164789] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.vnc_keymap = en-us {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.164954] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.vnc_port = 5900 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.165136] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vmware.vnc_port_total = 10000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.165325] 
env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.auth_schemes = ['none'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.165502] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.165798] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.165988] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.166180] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.novncproxy_port = 6080 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.166358] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.server_listen = 127.0.0.1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.166535] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.166700] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.vencrypt_ca_certs = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.166861] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.vencrypt_client_cert = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.167035] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vnc.vencrypt_client_key = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.167215] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.167382] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.disable_deep_image_inspection = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.167547] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.167711] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
472.167907] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.168089] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.disable_rootwrap = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.168260] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.enable_numa_live_migration = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.168426] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.168589] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.168751] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.168951] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.libvirt_disable_apic = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.169152] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.169322] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.169486] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.169651] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.169814] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.169977] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.170157] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.170321] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.170482] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.170650] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.170835] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.171035] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.client_socket_timeout = 900 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.171224] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.default_pool_size = 1000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.171397] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.keep_alive = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.171566] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.max_header_line = 16384 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.171733] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.secure_proxy_ssl_header = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.171897] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.ssl_ca_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.172076] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.ssl_cert_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.172242] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.ssl_key_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.172410] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] wsgi.tcp_keepidle = 600 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.172588] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.172759] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] zvm.ca_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.172924] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] zvm.cloud_connector_url = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.173227] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.173407] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] zvm.reachable_timeout = 300 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.173592] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.enforce_new_defaults = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.173765] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.enforce_scope = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.173946] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.policy_default_rule = default {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.174149] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.174331] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.policy_file = policy.yaml {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.174507] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.174673] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.174837] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.174998] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.175184] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.175357] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.175536] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.175717] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.connection_string = messaging:// {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.175888] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.enabled = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.176077] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.es_doc_type = notification {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.176250] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.es_scroll_size = 10000 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.176423] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.es_scroll_time = 2m {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.176589] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.filter_error_trace = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.176760] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.hmac_keys = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.176933] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.sentinel_service_name = mymaster {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.177119] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.socket_timeout = 0.1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.177301] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.trace_requests = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.177538] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler.trace_sqlalchemy = False {{(pid=60788) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.177786] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler_jaeger.process_tags = {} {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.178039] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler_jaeger.service_name_prefix = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.178281] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] profiler_otlp.service_name_prefix = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.178515] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] remote_debug.host = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.178737] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] remote_debug.port = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.179009] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.179259] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.179518] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.179787] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.179984] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.180175] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.180341] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.180508] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.180672] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.180834] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.181016] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.181197] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.181370] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.181540] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.181704] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.181882] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.182065] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.182237] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.182405] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.182570] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.182735] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.182903] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.183082] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.183248] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.183420] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.183587] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.ssl = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.183762] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.183934] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.184116] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.184294] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.184466] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.184653] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.184826] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_notifications.retry = -1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.185024] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.185212] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.185391] env[60788]: DEBUG oslo_service.service 
[None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.auth_section = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.185558] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.auth_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.185721] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.cafile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.185899] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.certfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.186079] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.collect_timing = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.186246] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.connect_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.186410] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.connect_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.186573] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.endpoint_id = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.186733] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.endpoint_override = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.186895] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.insecure = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.187069] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.keyfile = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.187237] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.max_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.187399] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.min_version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.187564] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.region_name = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.187724] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.service_name = None {{(pid=60788) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.187913] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.service_type = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.188098] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.split_loggers = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.188265] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.status_code_retries = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.188430] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.status_code_retry_delay = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.188591] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.timeout = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.188751] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.valid_interfaces = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.188932] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_limit.version = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.189128] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_reports.file_event_handler = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.189302] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.189465] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] oslo_reports.log_dir = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.189641] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.189805] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.189969] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.190155] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.190322] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.190484] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.190657] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.190818] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_ovs_privileged.group = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.190979] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.191166] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.191333] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.191497] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] vif_plug_ovs_privileged.user = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.191670] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.191852] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.192059] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.192249] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.192426] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.192596] 
env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.192764] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.192930] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.193128] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.193305] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.isolate_vif = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.193475] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.193644] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.193815] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.193985] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.194168] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_vif_ovs.per_port_bridge = False {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.194336] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_brick.lock_path = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.194502] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.194667] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.194838] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] privsep_osbrick.capabilities = [21] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.194998] 
env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] privsep_osbrick.group = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.195177] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] privsep_osbrick.helper_command = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.195345] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.195509] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.195669] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] privsep_osbrick.user = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.195868] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.196015] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] nova_sys_admin.group = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.196181] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] nova_sys_admin.helper_command = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.196349] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.196513] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.196671] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] nova_sys_admin.user = None {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 472.196801] env[60788]: DEBUG oslo_service.service [None req-3fc63e29-798e-449d-8d41-28463b791f98 None None] ******************************************************************************** {{(pid=60788) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 472.197231] env[60788]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 472.207187] env[60788]: WARNING nova.virt.vmwareapi.driver [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 472.207651] env[60788]: INFO nova.virt.node [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Generated node identity 75623588-d529-4955-b0d7-8c3260d605e7 [ 472.207900] env[60788]: INFO nova.virt.node [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Wrote node identity 75623588-d529-4955-b0d7-8c3260d605e7 to /opt/stack/data/n-cpu-1/compute_id [ 472.220904] env[60788]: WARNING nova.compute.manager [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Compute nodes ['75623588-d529-4955-b0d7-8c3260d605e7'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 472.253928] env[60788]: INFO nova.compute.manager [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 472.274015] env[60788]: WARNING nova.compute.manager [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 472.274271] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 472.274744] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 472.274744] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 472.274861] env[60788]: DEBUG nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 472.275985] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-072b7d83-77c1-459f-939a-42fadf4ea558 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.285175] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e8c768-ef96-4e91-800e-95fa1d46d1c5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.298897] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8d1888e-bcc9-45c3-8e1c-8f3ca1b36952 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.305079] env[60788]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e9fe36f-3266-4391-bd32-f945784ac876 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.334835] env[60788]: DEBUG nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181250MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 472.334961] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 472.335140] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 472.346373] env[60788]: WARNING nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] No compute node record for cpu-1:75623588-d529-4955-b0d7-8c3260d605e7: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 75623588-d529-4955-b0d7-8c3260d605e7 could not be found. [ 472.359410] env[60788]: INFO nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 75623588-d529-4955-b0d7-8c3260d605e7 [ 472.409290] env[60788]: DEBUG nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 472.409458] env[60788]: DEBUG nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 472.515350] env[60788]: INFO nova.scheduler.client.report [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] [req-9f1b7a7d-3208-4268-946a-43f73e990093] Created resource provider record via placement API for resource provider with UUID 75623588-d529-4955-b0d7-8c3260d605e7 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
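(Editor's note: the inventory dicts logged below follow the placement API inventory schema. As a minimal, purely illustrative Python sketch of how schedulable capacity falls out of such a record — the values are copied from the set_inventory_for_provider entries that follow, and the formula capacity = (total - reserved) * allocation_ratio is the usual placement reading of these fields, not code taken from nova itself:)

    # Inventory reported for provider 75623588-d529-4955-b0d7-8c3260d605e7,
    # copied verbatim from the log records below.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176,
                    'step_size': 1, 'allocation_ratio': 1.0},
    }

    # Effective schedulable capacity per resource class:
    #   capacity = (total - reserved) * allocation_ratio
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {capacity:g}")
    # -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400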
[ 472.533149] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40e83e41-b123-40fc-86a9-fc5be556484a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.541301] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cefd859c-e40f-4d19-b98e-4e77cc901d96 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.570618] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95f5c927-1c52-43bf-865e-87862918fb41 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.577487] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a523f829-fb4e-4be1-8b08-2be0a7ae968f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 472.590232] env[60788]: DEBUG nova.compute.provider_tree [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 472.628150] env[60788]: DEBUG nova.scheduler.client.report [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Updated inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 472.628350] env[60788]: DEBUG nova.compute.provider_tree [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Updating resource provider 75623588-d529-4955-b0d7-8c3260d605e7 generation from 0 to 1 during operation: update_inventory {{(pid=60788) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 472.628495] env[60788]: DEBUG nova.compute.provider_tree [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 472.675009] env[60788]: DEBUG nova.compute.provider_tree [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Updating 
resource provider 75623588-d529-4955-b0d7-8c3260d605e7 generation from 1 to 2 during operation: update_traits {{(pid=60788) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 472.691692] env[60788]: DEBUG nova.compute.resource_tracker [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 472.691875] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 472.692059] env[60788]: DEBUG nova.service [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Creating RPC server for service compute {{(pid=60788) start /opt/stack/nova/nova/service.py:182}} [ 472.704867] env[60788]: DEBUG nova.service [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] Join ServiceGroup membership for this service compute {{(pid=60788) start /opt/stack/nova/nova/service.py:199}} [ 472.704867] env[60788]: DEBUG nova.servicegroup.drivers.db [None req-d4e80d6d-649e-4198-bb3b-93a6bc440454 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60788) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 475.708557] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 475.721037] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Getting list of instances from cluster (obj){ [ 475.721037] env[60788]: value = "domain-c8" [ 475.721037] env[60788]: _type = "ClusterComputeResource" [ 475.721037] env[60788]: } {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 475.721037] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ef6a67c-4bde-4ee9-9371-42dd3cfe799a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 475.729280] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Got total of 0 instances {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 475.729618] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 475.730026] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Getting list of instances from cluster (obj){ [ 475.730026] env[60788]: value = "domain-c8" [ 475.730026] env[60788]: _type = "ClusterComputeResource" [ 475.730026] env[60788]: } {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 475.731130] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cd70bcf-3cb3-49a7-bec0-a7e02f323adc 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 475.738785] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Got total of 0 instances {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 482.031500] env[60788]: DEBUG dbcounter [-] [60788] Writing DB stats nova_cell1:SELECT=1 {{(pid=60788) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}}
[ 482.032237] env[60788]: DEBUG dbcounter [-] [60788] Writing DB stats nova_cell0:SELECT=1 {{(pid=60788) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}}
[ 519.720979] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "f88189b2-070f-4529-af1b-67c8d9b271a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 519.722012] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 519.748954] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 519.912146] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 519.912453] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 519.916513] env[60788]: INFO nova.compute.claims [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 520.088779] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e240cbd-996b-4f3c-b6f7-6a902a97b468 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.101325] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05592616-d049-47c6-8a6e-e54b29ddab1f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.150413] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a298b1e-72ab-4f63-9cb1-153b13c0dfde {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.160136] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39284b6c-4816-429b-8d4b-beb15127bb76 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.177247] env[60788]: DEBUG nova.compute.provider_tree [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 520.202129] env[60788]: DEBUG nova.scheduler.client.report [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 520.229024] env[60788]: DEBUG 
oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 520.229982] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 520.302579] env[60788]: DEBUG nova.compute.utils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 520.305130] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 520.305367] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 520.341380] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquiring lock "1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 520.341380] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Lock "1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 520.342570] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Start building block device mappings for instance.
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 520.392066] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 520.392356] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Lock "1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 520.394842] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 520.411021] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 520.501988] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 520.502340] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 520.505427] env[60788]: INFO nova.compute.claims [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 520.546742] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Start spawning the instance on the hypervisor.
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 520.551386] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 520.683515] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04449c42-b373-4cf7-ae66-0f9a9b1a4291 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.693146] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-711cfc76-cd82-4fb7-9ac3-10fe1cbbd8c3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.728023] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-879eb029-d182-4907-bc10-380198534447 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.737053] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af2dc32c-b182-4bf3-9fc4-d26d1b6e33ab {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.752762] env[60788]: DEBUG nova.compute.provider_tree [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 520.764566] env[60788]: DEBUG nova.scheduler.client.report [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 520.784130] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 520.784695] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Start building networks asynchronously for instance. 
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 520.787958] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.237s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 520.791868] env[60788]: INFO nova.compute.claims [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 520.838049] env[60788]: DEBUG nova.compute.utils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 520.839930] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Not allocating networking since 'none' was specified. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 520.862908] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 520.919202] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94a45fff-c1c5-4fae-8126-c2b6280b7dd9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.931942] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1225799d-b6a5-456f-a013-8afec0af1ba6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.969088] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 520.970577] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40b976d8-00f7-43c5-8c1c-2fcb7dd263f3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.979287] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c96c363f-c8fc-4336-8a15-1a77442016da {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 520.994407] env[60788]: DEBUG nova.compute.provider_tree [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 521.008227] env[60788]: DEBUG nova.scheduler.client.report [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 521.030584] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 521.031097] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 521.070872] env[60788]: DEBUG nova.compute.utils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 521.072674] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Allocating IP information in the background. 
{{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 521.072797] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 521.087751] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 521.182730] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 521.324809] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 521.325146] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 521.325358] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 521.326223] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 521.326369] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 521.326512] env[60788]: DEBUG nova.virt.hardware 
[None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 521.326740] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 521.326897] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 521.327631] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 521.328526] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 521.328526] env[60788]: DEBUG nova.virt.hardware [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 521.332193] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4df63ca-f516-42d3-b113-a05c1861d537 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.349060] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f6b2436-d47f-493a-8610-19e6231da224 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.361405] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 521.361605] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 521.361757] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 521.361944] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 521.362286] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 521.362426] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 521.362694] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 521.364907] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 521.364907] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 521.364907] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 521.364907] env[60788]: DEBUG nova.virt.hardware [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 521.367843] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-448c0421-4de3-48e8-b327-47a9d157f538 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.395062] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 521.395336] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 521.395496] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 521.395677] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 521.395833] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 521.395984] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 521.396514] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 
tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 521.396789] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 521.400464] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 521.400464] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 521.400464] env[60788]: DEBUG nova.virt.hardware [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 521.408430] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed384cc6-c97a-4295-8cd5-7ae185a0f4cd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.426919] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-287f65be-37b2-4f68-9c3d-fc5090ab6357 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.443165] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2822e36-4a9c-41a3-a41d-2023aacd8944 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.449522] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Instance VIF info [] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 521.459803] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 521.461652] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-faa5ce76-bce2-4864-a804-27274b513675 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.476890] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15358b18-cc46-48e8-a886-a0b7239edc68 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.485517] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Created folder: OpenStack in parent group-v4. [ 521.485735] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Creating folder: Project (8ac1877523544a17b83b788e9893c942). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 521.497026] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-18ad814e-1797-469e-a37d-e8f208b4117c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.507932] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Created folder: Project (8ac1877523544a17b83b788e9893c942) in parent group-v449747. [ 521.508814] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Creating folder: Instances. Parent ref: group-v449748. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 521.510642] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-42003155-385c-49ba-89a3-66c4a556e833 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.520406] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Created folder: Instances in parent group-v449748. [ 521.520406] env[60788]: DEBUG oslo.service.loopingcall [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
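
The "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" line just logged comes from oslo.service's looping-call machinery. A rough sketch of the general pattern only, not Nova's exact wiring:

from oslo_service import loopingcall

def _poll(check):
    # `check` is a hypothetical completion predicate.
    if check():
        raise loopingcall.LoopingCallDone('finished')

timer = loopingcall.FixedIntervalLoopingCall(_poll, check=lambda: True)
result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone
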
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 521.520805] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 521.520805] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-23c669c8-f86f-4b4d-ad02-b46db418dbbb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.544305] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 521.544305] env[60788]: value = "task-2205094" [ 521.544305] env[60788]: _type = "Task" [ 521.544305] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 521.554541] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205094, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 521.626161] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquiring lock "175d889c-2151-4336-920f-db9a54253946" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 521.626161] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Lock "175d889c-2151-4336-920f-db9a54253946" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 521.658895] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Starting instance... 
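
The Acquiring/acquired/released triples around _locked_do_build_and_run_instance are oslo.concurrency named locks; the "waited" and "held" durations bracket the critical section. The pattern looks like the following, with a placeholder lock name rather than one of the instance UUIDs above:

from oslo_concurrency import lockutils

@lockutils.synchronized('instance-uuid-placeholder')
def _locked_do_build_and_run_instance():
    # Runs with the per-instance lock held; concurrent builds for the
    # same instance UUID queue here, producing the "waited N.NNNs" and
    # "held N.NNNs" lines in the log.
    pass
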
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 521.744545] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 521.744545] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 521.745559] env[60788]: INFO nova.compute.claims [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 521.900481] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5624d9d9-6655-4dfc-88c1-fa6217b0c002 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.910778] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-146a7f04-3345-4459-b6bd-33d861d7372e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.944605] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8698a2ba-9d6a-4c2c-9625-f73587c184fb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.952977] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e09eb28-632c-44a4-8ae6-b8b0d9b7a819 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 521.968064] env[60788]: DEBUG nova.compute.provider_tree [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 521.987217] env[60788]: DEBUG nova.scheduler.client.report [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 522.001223] env[60788]: DEBUG nova.policy [None 
req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2622e7e3d8424bcb8dc24406bff81ac1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e5e9ec9d68c04b37810fae19866f3a0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 522.007036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 522.008639] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 522.038628] env[60788]: DEBUG nova.policy [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '528220eb929141aa895c3a1878f774c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3becc06e3ba4cae8bcef73ba4a36050', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 522.060448] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205094, 'name': CreateVM_Task} progress is 99%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 522.067021] env[60788]: DEBUG nova.compute.utils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 522.070540] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Allocating IP information in the background. 
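
The nova.policy DEBUG lines above record a failed, but non-fatal, check of network:attach_external_network for member/reader tokens. A self-contained oslo.policy sketch of that kind of check, with assumed rule text and credentials:

from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
enforcer.register_default(
    policy.RuleDefault('network:attach_external_network', 'is_admin:True'))

creds = {'roles': ['member', 'reader'], 'is_admin': False}
# do_raise=False mirrors the logged behaviour: a failed check is recorded
# at DEBUG level and the build continues without external-network access.
print(enforcer.authorize('network:attach_external_network',
                         {}, creds, do_raise=False))  # False
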
{{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 522.071091] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 522.077878] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 522.183388] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 522.215895] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 522.216402] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 522.216746] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 522.217188] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 522.217431] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 522.217890] env[60788]: DEBUG 
nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 522.218221] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 522.218520] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 522.218811] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 522.219087] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 522.219474] env[60788]: DEBUG nova.virt.hardware [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 522.221040] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b760d0d-bc57-4c16-bd03-5e5504eb0197 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 522.231886] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f2b4658-a7e5-4910-a965-1200d610060b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 522.389260] env[60788]: DEBUG nova.policy [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de56d66c05a1483e931389e00aa42249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab4a6671f8254468aefa43fe62bb8ec9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 522.562905] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205094, 'name': CreateVM_Task, 'duration_secs': 0.575478} completed 
successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 522.563139] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 522.564243] env[60788]: DEBUG oslo_vmware.service [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdae9f9f-96f3-4dd5-886b-b342516bc651 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 522.577451] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 522.577639] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 522.578759] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 522.578759] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4a41bedf-a2fe-4367-b2ed-0c074fbed414 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 522.586765] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Waiting for the task: (returnval){ [ 522.586765] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52872653-bc66-10bb-2382-b2530d40740d" [ 522.586765] env[60788]: _type = "Task" [ 522.586765] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 522.598066] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52872653-bc66-10bb-2382-b2530d40740d, 'name': SearchDatastore_Task} progress is 0%. 
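
Both the CreateVM_Task and SearchDatastore_Task waits above follow the same oslo.vmware pattern: invoke the vSphere method, receive a task reference, and poll it. A sketch, with the managed-object references and config spec as caller-supplied placeholders:

def create_vm_and_wait(session, folder_ref, config_spec, res_pool_ref):
    # Folder.CreateVM_Task is the method invoked in the log above.
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                                  config=config_spec, pool=res_pool_ref)
    # wait_for_task polls the task's state, emitting the
    # "Task: {...} progress is N%" DEBUG lines, and returns the task info
    # on success or raises on task error.
    return session.wait_for_task(task_ref)
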
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 522.828515] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 522.828959] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 522.841645] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 522.910716] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 522.910950] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 522.912721] env[60788]: INFO nova.compute.claims [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 523.085756] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13fbf748-30d1-4bbd-acc7-28e0f71a800e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.105282] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f650a3b-b87e-4265-b17e-7e8e905392b4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.108897] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 523.110108] env[60788]: 
DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 523.110108] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 523.110108] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 523.110108] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 523.110446] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2efa0758-2a18-405a-bfc3-79d486015cbe {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.137764] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3bb8728-7a7b-4846-a723-fa1803fa5610 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.144957] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5292279e-2c30-401e-9d3c-c4c80ffddd4d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.159317] env[60788]: DEBUG nova.compute.provider_tree [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 523.163727] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 523.163727] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Folder [datastore2] devstack-image-cache_base created. 
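
The image-cache directory above is created through FileManager.MakeDirectory. A small sketch, assuming a session and datacenter reference supplied by the caller:

def mkdir(session, ds_path, dc_ref):
    # createParentDirectories=True makes creating nested cache paths such
    # as "[datastore2] devstack-image-cache_base" effectively idempotent.
    file_manager = session.vim.service_content.fileManager
    session.invoke_api(session.vim, 'MakeDirectory', file_manager,
                       name=ds_path, datacenter=dc_ref,
                       createParentDirectories=True)
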
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 523.163727] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab25aea8-6179-49c5-8c9e-0e69ea27a826 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.169565] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4070115e-0782-4c95-a770-4994466bc0c0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.172262] env[60788]: DEBUG nova.scheduler.client.report [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 523.178934] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Waiting for the task: (returnval){ [ 523.178934] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525ee922-9664-9d29-72f0-c418cd93f04e" [ 523.178934] env[60788]: _type = "Task" [ 523.178934] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 523.186294] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]525ee922-9664-9d29-72f0-c418cd93f04e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 523.194305] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 523.194645] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Start building networks asynchronously for instance. 
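
A worked reading of the inventory record the scheduler report client logs above: usable capacity per resource class is (total - reserved) * allocation_ratio, consumed in multiples of step_size. Plain arithmetic, no API involved:

inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
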
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 523.240475] env[60788]: DEBUG nova.compute.utils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 523.242241] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 523.242320] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 523.260634] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 523.375476] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Start spawning the instance on the hypervisor. 
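
"Allocating IP information in the background" means port creation is kicked off asynchronously so block-device and spawn work can proceed in parallel. Nova does this with its own green-thread helpers; a conceptually equivalent stdlib sketch:

from concurrent.futures import ThreadPoolExecutor

def allocate_for_instance():
    return ['port-id']  # placeholder for the Neutron call in the log

with ThreadPoolExecutor(max_workers=1) as pool:
    network_future = pool.submit(allocate_for_instance)
    # ... build block device mappings concurrently ...
    vifs = network_future.result()  # block only where the VIFs are needed
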
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 523.417374] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 523.417374] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 523.417374] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 523.417612] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 523.417612] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 523.417682] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 523.418149] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 523.418149] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 523.418252] 
env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 523.418338] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 523.418501] env[60788]: DEBUG nova.virt.hardware [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 523.419795] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6654f01-0447-4313-b2db-39905d6f0d76 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.429658] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b322cc82-27ff-45ed-b174-7709335d8759 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.694952] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 523.694952] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Creating directory with path [datastore2] vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 523.694952] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0fc99f2-84b8-4c9d-8b5d-61131c2b6cbf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.718067] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Created directory with path [datastore2] vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 523.718299] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Fetch image to [datastore2] vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 523.718463] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 
tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 523.719331] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b201d926-c598-4687-9bec-5ecb239736fc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.730707] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c1bc22-2018-4073-a95e-abce4e485844 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.740696] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f1c8b61-12da-49ab-9269-e5124f06bb26 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.776126] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-130808b1-a9f2-4590-bf65-a11987964e86 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.788061] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-07487d41-4177-4143-8eed-30a4a04b94f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 523.819020] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 523.819399] env[60788]: DEBUG nova.policy [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '781bfaf36d664710b2d9aa8932475f61', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c43107aa74c74077983b1f53231772bc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 523.836572] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Successfully created port: f299dfa4-7060-466f-b066-ddb11b0f4faf {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 523.900301] env[60788]: DEBUG oslo_vmware.rw_handles [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 
tempest-ServersAdmin275Test-574239960-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 523.978301] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Successfully created port: 0760d543-6bb2-4b26-a28e-fc6eb0713565 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 523.986020] env[60788]: DEBUG oslo_vmware.rw_handles [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 523.986235] env[60788]: DEBUG oslo_vmware.rw_handles [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 524.120158] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Successfully created port: db8115b3-50ec-462d-998b-ef0aedd79dc1 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 524.529835] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 524.530485] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 524.547312] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Starting instance... 
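
The rw_handles lines above describe streaming 21318656 bytes of image data into the datastore's HTTPS folder endpoint. Stripped to its core, the transfer is a chunked copy from the Glance image iterator into the write handle (names below are placeholders):

def transfer(image_iter, write_handle):
    for chunk in image_iter:   # "Completed reading data from the image iterator"
        write_handle.write(chunk)
    write_handle.close()       # "Closing write handle for https://..."
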
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 524.643740] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 524.643994] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 524.647429] env[60788]: INFO nova.compute.claims [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 524.860904] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e3c599-2c16-4d7c-a88f-c1ba82c64802 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 524.870444] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c2eb934-0b24-41b0-bdb8-bbc5093e5732 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 524.912128] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96b3bc75-91a8-4d6b-baeb-f3280afccd5e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 524.921760] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b6c1a86-b68c-45c4-931b-45867ea467e7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 524.939282] env[60788]: DEBUG nova.compute.provider_tree [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 524.953437] env[60788]: DEBUG nova.scheduler.client.report [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 524.978874] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 524.979447] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 525.023752] env[60788]: DEBUG nova.compute.utils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 525.029027] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 525.029027] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 525.043530] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 525.138064] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 525.171590] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 525.171811] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 525.172079] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 525.175025] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 525.175924] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 525.175924] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 525.175924] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 525.175924] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 525.177542] env[60788]: DEBUG nova.virt.hardware [None 
req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 525.177944] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 525.178080] env[60788]: DEBUG nova.virt.hardware [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 525.179391] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90607c65-3b62-4a3b-9db7-ecc3d9f25e20 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 525.190773] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d195339-e935-4418-9de8-30b5eb82665b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 525.694620] env[60788]: DEBUG nova.policy [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48c3dc11379940f0b4f71ef5b0657225', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c1c52959b8554316975b8a38175325b3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 526.170114] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "fb58b00e-1a78-4750-b912-48c94144ea66" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.170415] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "fb58b00e-1a78-4750-b912-48c94144ea66" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.189738] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 526.268781] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 526.268781] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 526.271709] env[60788]: INFO nova.compute.claims [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 526.475515] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Successfully created port: a959fb32-642f-4e1a-a2a2-4585ed732da1 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 526.494704] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c2ec08f-665d-4ded-b10e-ed53238dc930 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.502310] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-042aed77-3656-4399-9ead-1929ce03af9f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.538431] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c9381aa-eddf-4c34-869e-80ae477c21b6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.546922] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3babcdc7-60fc-4c17-9fb5-5a2a30d1271f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.562315] env[60788]: DEBUG nova.compute.provider_tree [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 526.582901] env[60788]: DEBUG nova.scheduler.client.report [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 526.600571] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 526.601540] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 526.656600] env[60788]: DEBUG nova.compute.utils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 526.659066] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 526.659406] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 526.677735] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 526.799913] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 526.838025] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 526.838025] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 526.838025] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 526.838518] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 526.838518] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 526.838518] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 526.838700] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 526.838735] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 526.838931] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e 
tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 526.839045] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 526.839389] env[60788]: DEBUG nova.virt.hardware [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 526.840197] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c81c9d93-a613-48b8-a04b-b9e8ec3019db {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 526.849799] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-323e6003-19a2-4248-a184-160a3baf67b9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 527.149710] env[60788]: DEBUG nova.policy [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0731d77f52124d17abd68ed146a0c60d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d0c1e93824f41b7845970fd9b98241c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 528.765415] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.765637] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.765637] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 528.765767] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 528.790382] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.790382] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.790382] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.790498] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 175d889c-2151-4336-920f-db9a54253946] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.791512] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.791512] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.791512] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 528.791512] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 528.795171] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.795171] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.795171] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.795404] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.795569] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.795865] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.795865] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 528.796095] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 528.803553] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Successfully updated port: f299dfa4-7060-466f-b066-ddb11b0f4faf {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 528.824286] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 528.824676] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 528.825632] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 528.825797] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 528.827207] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-979690a5-a0b8-44c6-9f33-f3c3ab2a4404 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 528.832106] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquiring lock "refresh_cache-175d889c-2151-4336-920f-db9a54253946" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 528.832269] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquired lock "refresh_cache-175d889c-2151-4336-920f-db9a54253946" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 528.832887] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 528.844520] 
env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccef3dea-cdea-4d8c-8e2d-867910b2d05f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 528.866741] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fe3e391-ebee-42f4-a5b6-decb2ebce9b2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 528.877043] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2da051cb-c697-4717-a533-12c49eb846f7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 528.913816] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181239MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 528.913986] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 528.914214] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 529.048268] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f88189b2-070f-4529-af1b-67c8d9b271a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.048564] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.048564] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.048993] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 175d889c-2151-4336-920f-db9a54253946 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.048993] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.048993] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.049266] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb58b00e-1a78-4750-b912-48c94144ea66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 529.049266] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 529.049462] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 529.094520] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Successfully updated port: db8115b3-50ec-462d-998b-ef0aedd79dc1 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 529.109195] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquiring lock "refresh_cache-1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 529.109195] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquired lock "refresh_cache-1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 529.109195] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Building network info cache for instance {{(pid=60788) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 529.199022] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 529.245312] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94fed608-044f-4ee7-88e0-476e6caac04c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 529.255129] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ac6dbb-636a-4527-a0d7-fd70eb03496a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 529.297807] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-074d1185-93e5-43e1-b8b6-e91a972871ae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 529.308439] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c664126-91a6-4d71-b7b8-1029e9720ab1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 529.326238] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 529.339294] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 529.369185] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 529.369393] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.455s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 529.533589] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Successfully updated port: 0760d543-6bb2-4b26-a28e-fc6eb0713565 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 529.539665] env[60788]: DEBUG 
nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Successfully created port: dc073daa-4854-4964-b072-111ec8dde874 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 529.549699] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "refresh_cache-f88189b2-070f-4529-af1b-67c8d9b271a8" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 529.549856] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired lock "refresh_cache-f88189b2-070f-4529-af1b-67c8d9b271a8" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 529.550017] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 529.563740] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 529.785627] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 529.975374] env[60788]: DEBUG nova.compute.manager [req-921f658c-b1ab-40b6-b9de-4cf4d22ba71a req-b254f290-facd-4dda-9434-49f22018247b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Received event network-vif-plugged-db8115b3-50ec-462d-998b-ef0aedd79dc1 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 529.975616] env[60788]: DEBUG oslo_concurrency.lockutils [req-921f658c-b1ab-40b6-b9de-4cf4d22ba71a req-b254f290-facd-4dda-9434-49f22018247b service nova] Acquiring lock "1091a5ac-788a-4a8b-8f29-ad766fe5ffa2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 529.975823] env[60788]: DEBUG oslo_concurrency.lockutils [req-921f658c-b1ab-40b6-b9de-4cf4d22ba71a req-b254f290-facd-4dda-9434-49f22018247b service nova] Lock "1091a5ac-788a-4a8b-8f29-ad766fe5ffa2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 529.975986] env[60788]: DEBUG oslo_concurrency.lockutils [req-921f658c-b1ab-40b6-b9de-4cf4d22ba71a req-b254f290-facd-4dda-9434-49f22018247b service nova] Lock "1091a5ac-788a-4a8b-8f29-ad766fe5ffa2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 529.976171] env[60788]: DEBUG nova.compute.manager [req-921f658c-b1ab-40b6-b9de-4cf4d22ba71a req-b254f290-facd-4dda-9434-49f22018247b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] No waiting events found dispatching network-vif-plugged-db8115b3-50ec-462d-998b-ef0aedd79dc1 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 529.976344] env[60788]: WARNING nova.compute.manager [req-921f658c-b1ab-40b6-b9de-4cf4d22ba71a req-b254f290-facd-4dda-9434-49f22018247b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Received unexpected event network-vif-plugged-db8115b3-50ec-462d-998b-ef0aedd79dc1 for instance with vm_state building and task_state spawning. 
[ 530.036886] env[60788]: DEBUG nova.compute.manager [req-e2f24203-f3fe-49ab-89ef-fbd4ac8927e3 req-8591db26-0330-45f6-a6bb-3032adb97504 service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Received event network-vif-plugged-f299dfa4-7060-466f-b066-ddb11b0f4faf {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 530.037669] env[60788]: DEBUG oslo_concurrency.lockutils [req-e2f24203-f3fe-49ab-89ef-fbd4ac8927e3 req-8591db26-0330-45f6-a6bb-3032adb97504 service nova] Acquiring lock "175d889c-2151-4336-920f-db9a54253946-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 530.037972] env[60788]: DEBUG oslo_concurrency.lockutils [req-e2f24203-f3fe-49ab-89ef-fbd4ac8927e3 req-8591db26-0330-45f6-a6bb-3032adb97504 service nova] Lock "175d889c-2151-4336-920f-db9a54253946-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 530.038132] env[60788]: DEBUG oslo_concurrency.lockutils [req-e2f24203-f3fe-49ab-89ef-fbd4ac8927e3 req-8591db26-0330-45f6-a6bb-3032adb97504 service nova] Lock "175d889c-2151-4336-920f-db9a54253946-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 530.038325] env[60788]: DEBUG nova.compute.manager [req-e2f24203-f3fe-49ab-89ef-fbd4ac8927e3 req-8591db26-0330-45f6-a6bb-3032adb97504 service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] No waiting events found dispatching network-vif-plugged-f299dfa4-7060-466f-b066-ddb11b0f4faf {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 530.038492] env[60788]: WARNING nova.compute.manager [req-e2f24203-f3fe-49ab-89ef-fbd4ac8927e3 req-8591db26-0330-45f6-a6bb-3032adb97504 service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Received unexpected event network-vif-plugged-f299dfa4-7060-466f-b066-ddb11b0f4faf for instance with vm_state building and task_state spawning. 
[ 530.678942] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Updating instance_info_cache with network_info: [{"id": "f299dfa4-7060-466f-b066-ddb11b0f4faf", "address": "fa:16:3e:b5:55:a4", "network": {"id": "3f7667dd-14bb-45f3-8565-c5f043c1dd46", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1035315435-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab4a6671f8254468aefa43fe62bb8ec9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e6db039c-542c-4544-a57d-ddcc6c1e8e45", "external-id": "nsx-vlan-transportzone-810", "segmentation_id": 810, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf299dfa4-70", "ovs_interfaceid": "f299dfa4-7060-466f-b066-ddb11b0f4faf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 530.700207] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Releasing lock "refresh_cache-175d889c-2151-4336-920f-db9a54253946" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 530.704395] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Instance network_info: |[{"id": "f299dfa4-7060-466f-b066-ddb11b0f4faf", "address": "fa:16:3e:b5:55:a4", "network": {"id": "3f7667dd-14bb-45f3-8565-c5f043c1dd46", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1035315435-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab4a6671f8254468aefa43fe62bb8ec9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e6db039c-542c-4544-a57d-ddcc6c1e8e45", "external-id": "nsx-vlan-transportzone-810", "segmentation_id": 810, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf299dfa4-70", "ovs_interfaceid": "f299dfa4-7060-466f-b066-ddb11b0f4faf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 530.704621] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b5:55:a4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e6db039c-542c-4544-a57d-ddcc6c1e8e45', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f299dfa4-7060-466f-b066-ddb11b0f4faf', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 530.713114] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Creating folder: Project (ab4a6671f8254468aefa43fe62bb8ec9). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 530.714877] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Successfully created port: 98e811dc-7dd0-460f-ae20-e5cdab2bae31 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 530.717146] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bca50aa8-714a-4e7b-8403-9809a07be10a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 530.732427] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Created folder: Project (ab4a6671f8254468aefa43fe62bb8ec9) in parent group-v449747. [ 530.733039] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Creating folder: Instances. Parent ref: group-v449751. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 530.733039] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd7ed63b-f11b-4beb-ac02-a5c5e68e8570 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 530.741983] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Created folder: Instances in parent group-v449751. [ 530.742148] env[60788]: DEBUG oslo.service.loopingcall [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 530.742321] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 175d889c-2151-4336-920f-db9a54253946] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 530.744365] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-71633521-d76f-46ec-8b5a-8e6fa988bf38 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 530.766656] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 530.766656] env[60788]: value = "task-2205097" [ 530.766656] env[60788]: _type = "Task" [ 530.766656] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 530.778219] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205097, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 530.835970] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Updating instance_info_cache with network_info: [{"id": "db8115b3-50ec-462d-998b-ef0aedd79dc1", "address": "fa:16:3e:33:6a:63", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.63", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb8115b3-50", "ovs_interfaceid": "db8115b3-50ec-462d-998b-ef0aedd79dc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 530.856906] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Releasing lock "refresh_cache-1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 530.857295] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Instance network_info: |[{"id": "db8115b3-50ec-462d-998b-ef0aedd79dc1", "address": "fa:16:3e:33:6a:63", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": 
"192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.63", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb8115b3-50", "ovs_interfaceid": "db8115b3-50ec-462d-998b-ef0aedd79dc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 530.857704] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:6a:63', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'db8115b3-50ec-462d-998b-ef0aedd79dc1', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 530.868799] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Creating folder: Project (e3becc06e3ba4cae8bcef73ba4a36050). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 530.870459] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8996a4b9-c9fe-40d7-911e-0c0985551122 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 530.886350] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Created folder: Project (e3becc06e3ba4cae8bcef73ba4a36050) in parent group-v449747. [ 530.886350] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Creating folder: Instances. Parent ref: group-v449754. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 530.886350] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-22a55f7c-ba81-4965-b8f8-c92f0c104fb8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 530.896628] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Created folder: Instances in parent group-v449754. 
[ 530.897345] env[60788]: DEBUG oslo.service.loopingcall [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 530.897776] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 530.898202] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-86d58c69-8f39-45cf-87c4-0ca12a556445 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 530.924934] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 530.924934] env[60788]: value = "task-2205100" [ 530.924934] env[60788]: _type = "Task" [ 530.924934] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 530.937127] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205100, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 530.971631] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Updating instance_info_cache with network_info: [{"id": "0760d543-6bb2-4b26-a28e-fc6eb0713565", "address": "fa:16:3e:04:87:a2", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0760d543-6b", "ovs_interfaceid": "0760d543-6bb2-4b26-a28e-fc6eb0713565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 530.996527] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Releasing lock "refresh_cache-f88189b2-070f-4529-af1b-67c8d9b271a8" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 530.997929] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 
f88189b2-070f-4529-af1b-67c8d9b271a8] Instance network_info: |[{"id": "0760d543-6bb2-4b26-a28e-fc6eb0713565", "address": "fa:16:3e:04:87:a2", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0760d543-6b", "ovs_interfaceid": "0760d543-6bb2-4b26-a28e-fc6eb0713565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 530.998144] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:04:87:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0760d543-6bb2-4b26-a28e-fc6eb0713565', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 531.014602] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating folder: Project (e5e9ec9d68c04b37810fae19866f3a0b). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 531.015655] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c3303b20-6586-471a-800a-208abf7af73a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 531.031159] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Created folder: Project (e5e9ec9d68c04b37810fae19866f3a0b) in parent group-v449747. [ 531.031556] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating folder: Instances. Parent ref: group-v449757. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 531.031931] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1c2b0e1e-4fd7-4856-98e5-44344eb0d298 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 531.042138] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Created folder: Instances in parent group-v449757. [ 531.042460] env[60788]: DEBUG oslo.service.loopingcall [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 531.043310] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 531.043310] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-19c32665-6492-4eef-af22-72b4290e5a4b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 531.067409] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 531.067409] env[60788]: value = "task-2205103" [ 531.067409] env[60788]: _type = "Task" [ 531.067409] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 531.080054] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205103, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 531.080054] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Successfully updated port: a959fb32-642f-4e1a-a2a2-4585ed732da1 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 531.100172] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "refresh_cache-e883e763-d7c1-4eae-af6e-4a4e4a84e323" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 531.100540] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquired lock "refresh_cache-e883e763-d7c1-4eae-af6e-4a4e4a84e323" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 531.100640] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 531.287277] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 531.295244] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205097, 'name': CreateVM_Task, 'duration_secs': 0.324313} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 531.295517] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 175d889c-2151-4336-920f-db9a54253946] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 531.317337] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 531.317576] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 531.318072] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 531.318884] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eac98473-ca95-45ad-98b4-a8701ec219b7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 531.327546] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Waiting for the task: (returnval){ [ 531.327546] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5225d866-0ac1-45fb-4b87-56c999b86ff2" [ 531.327546] env[60788]: _type = "Task" [ 531.327546] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 531.338502] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5225d866-0ac1-45fb-4b87-56c999b86ff2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 531.436479] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205100, 'name': CreateVM_Task, 'duration_secs': 0.304004} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 531.436479] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 531.437284] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 531.585264] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205103, 'name': CreateVM_Task, 'duration_secs': 0.328629} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 531.588513] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 531.591414] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 531.843046] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 531.843317] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 531.843611] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 531.843694] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 531.844430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 531.844430] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3b17bc87-ff0e-4775-a4ae-a8d8e1793667 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 531.852425] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Waiting for the task: (returnval){ [ 531.852425] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]529ec376-98ec-ed32-0a06-907072c0511f" [ 531.852425] env[60788]: _type = "Task" [ 531.852425] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 531.861401] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]529ec376-98ec-ed32-0a06-907072c0511f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 532.297889] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Updating instance_info_cache with network_info: [{"id": "a959fb32-642f-4e1a-a2a2-4585ed732da1", "address": "fa:16:3e:b3:1c:97", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa959fb32-64", "ovs_interfaceid": "a959fb32-642f-4e1a-a2a2-4585ed732da1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 532.314757] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Releasing lock "refresh_cache-e883e763-d7c1-4eae-af6e-4a4e4a84e323" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 532.315337] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 
tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance network_info: |[{"id": "a959fb32-642f-4e1a-a2a2-4585ed732da1", "address": "fa:16:3e:b3:1c:97", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa959fb32-64", "ovs_interfaceid": "a959fb32-642f-4e1a-a2a2-4585ed732da1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 532.316300] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b3:1c:97', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a959fb32-642f-4e1a-a2a2-4585ed732da1', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 532.326644] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Creating folder: Project (c43107aa74c74077983b1f53231772bc). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 532.328318] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-037258bc-8fb7-4bc7-bf21-5610dc7c5d40 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.342155] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Created folder: Project (c43107aa74c74077983b1f53231772bc) in parent group-v449747. [ 532.342393] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Creating folder: Instances. Parent ref: group-v449760. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 532.343428] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0aeac562-2488-4b62-bf1f-925039abc46e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.366337] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Created folder: Instances in parent group-v449760. [ 532.366577] env[60788]: DEBUG oslo.service.loopingcall [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 532.367370] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "331fc548-2076-48e2-a84b-94130a99c2ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 532.367451] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "331fc548-2076-48e2-a84b-94130a99c2ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 532.368632] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 532.372544] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a1fbafbb-f345-4b34-b406-2e43bf1211e2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.392216] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 532.392478] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 532.392683] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 532.392962] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 532.397760] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 532.398503] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 532.399057] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d0f72ff1-f6eb-4109-a16b-52ce2ceb1ba3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.402894] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 532.402894] env[60788]: value = "task-2205106" [ 532.402894] env[60788]: _type = "Task" [ 532.402894] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 532.409138] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 532.409138] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]524f339f-e875-edea-d16f-9095f175c385" [ 532.409138] env[60788]: _type = "Task" [ 532.409138] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 532.424621] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205106, 'name': CreateVM_Task} progress is 5%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 532.432757] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 532.432757] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 532.432757] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 532.501198] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 532.501198] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 532.503275] env[60788]: INFO nova.compute.claims [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 532.745404] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8230c19-67e4-4649-978b-8b281a2e91f3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.754073] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4e8b523-8910-4ce5-90d4-088d88173ca9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.796088] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80033fa6-e30e-42a0-bcc8-1f2848777cdc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.805927] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b5224a8-1fd1-4f27-9a6d-c4479cca9a40 {{(pid=60788) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.820043] env[60788]: DEBUG nova.compute.provider_tree [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 532.831150] env[60788]: DEBUG nova.scheduler.client.report [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 532.847997] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 532.850016] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 532.902203] env[60788]: DEBUG nova.compute.utils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 532.904226] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 532.904402] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 532.919452] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205106, 'name': CreateVM_Task, 'duration_secs': 0.299305} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 532.919629] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 532.920343] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 532.920508] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 532.920961] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 532.921234] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6af5ea10-84ec-4d4f-b767-03b40c5fd5f2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 532.924344] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 532.928634] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Successfully updated port: dc073daa-4854-4964-b072-111ec8dde874 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 532.935383] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Waiting for the task: (returnval){ [ 532.935383] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52708f7c-f4eb-4237-da9c-cf596e4456f3" [ 532.935383] env[60788]: _type = "Task" [ 532.935383] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 532.947465] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "refresh_cache-231fcc6a-7ec4-4202-b960-ddc966ef2b9c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 532.947747] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquired lock "refresh_cache-231fcc6a-7ec4-4202-b960-ddc966ef2b9c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 532.947923] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 532.949571] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 532.949796] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 532.950024] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 533.001587] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 533.034996] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 533.035253] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 533.035409] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 533.035595] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 533.035813] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 533.035970] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 533.036198] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 533.036531] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 533.036531] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 533.036921] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 533.036921] env[60788]: DEBUG nova.virt.hardware [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 533.038050] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f7b5da2-adec-4a11-b713-c818ae14bcdd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 533.046337] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fd98aa3-e76c-4228-b2a1-8160d2b6e42d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 533.106158] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 533.261301] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Successfully updated port: 98e811dc-7dd0-460f-ae20-e5cdab2bae31 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 533.273667] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "refresh_cache-fb58b00e-1a78-4750-b912-48c94144ea66" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 533.273875] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquired lock "refresh_cache-fb58b00e-1a78-4750-b912-48c94144ea66" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 533.274067] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 533.923835] env[60788]: DEBUG nova.policy [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2741840be14143ffa6ac189396b78fe3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97fcd4e838a74b9eb0d979fce6f2839f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 533.925939] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 534.623769] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Updating instance_info_cache with network_info: [{"id": "dc073daa-4854-4964-b072-111ec8dde874", "address": "fa:16:3e:aa:e3:e3", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdc073daa-48", "ovs_interfaceid": "dc073daa-4854-4964-b072-111ec8dde874", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 534.641985] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Releasing lock "refresh_cache-231fcc6a-7ec4-4202-b960-ddc966ef2b9c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 534.642555] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance network_info: |[{"id": "dc073daa-4854-4964-b072-111ec8dde874", "address": "fa:16:3e:aa:e3:e3", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdc073daa-48", "ovs_interfaceid": "dc073daa-4854-4964-b072-111ec8dde874", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 534.642672] env[60788]: DEBUG nova.virt.vmwareapi.vmops 
[None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:aa:e3:e3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dc073daa-4854-4964-b072-111ec8dde874', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 534.654183] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Creating folder: Project (c1c52959b8554316975b8a38175325b3). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 534.654881] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7a06cf50-e48f-4435-a3d2-02299de84dc9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.666472] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Created folder: Project (c1c52959b8554316975b8a38175325b3) in parent group-v449747. [ 534.666855] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Creating folder: Instances. Parent ref: group-v449763. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 534.667099] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-caff5b8d-13d6-497e-a43d-b2565963452c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.676600] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Created folder: Instances in parent group-v449763. [ 534.676600] env[60788]: DEBUG oslo.service.loopingcall [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 534.676600] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 534.676600] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a5fbcb99-c6f7-4b81-a4df-b31c3ed3df6e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 534.696209] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 534.696209] env[60788]: value = "task-2205109" [ 534.696209] env[60788]: _type = "Task" [ 534.696209] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 534.706088] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205109, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 534.857616] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 534.857847] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 534.878316] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 534.978797] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 534.982662] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 534.982662] env[60788]: INFO nova.compute.claims [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 535.052042] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Updating instance_info_cache with network_info: [{"id": "98e811dc-7dd0-460f-ae20-e5cdab2bae31", "address": "fa:16:3e:b8:75:16", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.124", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], 
"meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap98e811dc-7d", "ovs_interfaceid": "98e811dc-7dd0-460f-ae20-e5cdab2bae31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 535.074587] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Releasing lock "refresh_cache-fb58b00e-1a78-4750-b912-48c94144ea66" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 535.080126] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance network_info: |[{"id": "98e811dc-7dd0-460f-ae20-e5cdab2bae31", "address": "fa:16:3e:b8:75:16", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.124", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap98e811dc-7d", "ovs_interfaceid": "98e811dc-7dd0-460f-ae20-e5cdab2bae31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 535.080380] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b8:75:16', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '98e811dc-7dd0-460f-ae20-e5cdab2bae31', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 535.088245] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Creating folder: Project (8d0c1e93824f41b7845970fd9b98241c). Parent ref: group-v449747. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 535.093498] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-90c94e36-790c-44f9-9dde-1a422e6402c8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.109440] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Created folder: Project (8d0c1e93824f41b7845970fd9b98241c) in parent group-v449747. [ 535.109707] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Creating folder: Instances. Parent ref: group-v449766. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 535.110342] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bdb903d5-dbf6-40dd-aaf3-628f8ded19d8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.126348] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Created folder: Instances in parent group-v449766. [ 535.126628] env[60788]: DEBUG oslo.service.loopingcall [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 535.126826] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 535.127071] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aa082c91-6651-460f-a3b7-429b65ea0131 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.155931] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 535.155931] env[60788]: value = "task-2205112" [ 535.155931] env[60788]: _type = "Task" [ 535.155931] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 535.170786] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205112, 'name': CreateVM_Task} progress is 6%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 535.211502] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205109, 'name': CreateVM_Task, 'duration_secs': 0.458363} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 535.217045] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 535.217045] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 535.217045] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 535.217045] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 535.217045] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a2481de1-f38b-47a0-94f2-8762d7be7021 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.220764] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Waiting for the task: (returnval){ [ 535.220764] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52754aa5-8420-7efa-33a4-3c6cf0eab88b" [ 535.220764] env[60788]: _type = "Task" [ 535.220764] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 535.235593] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52754aa5-8420-7efa-33a4-3c6cf0eab88b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 535.307099] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f36a1f22-18ac-49f1-bf25-f6e5e406e5a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.318601] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d851fe01-52b1-441f-ab44-5aed4bbe941d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.385616] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e68a4b0-d844-44b5-86d3-6e185d70b92e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.393988] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af2032e0-d5bc-4531-bb3c-f63782febf7c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.414171] env[60788]: DEBUG nova.compute.provider_tree [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 535.432984] env[60788]: DEBUG nova.scheduler.client.report [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 535.455643] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.474s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 535.455738] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Start building networks asynchronously for instance. 
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 535.472770] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "80e7296f-45ed-4987-9884-05bd883f4144" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 535.472873] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "80e7296f-45ed-4987-9884-05bd883f4144" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 535.492125] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 535.524963] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Successfully created port: 3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 535.534828] env[60788]: DEBUG nova.compute.utils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 535.536217] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 535.536404] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 535.559969] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Start building block device mappings for instance. 
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 535.597021] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 535.597021] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 535.597021] env[60788]: INFO nova.compute.claims [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 535.671348] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205112, 'name': CreateVM_Task, 'duration_secs': 0.371111} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 535.672128] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 535.673147] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 535.676147] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 535.731318] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 535.731508] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 535.732089] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 535.732089] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 535.732089] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 535.732089] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 535.732368] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 535.732529] 
env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 535.732690] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 535.732847] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 535.733026] env[60788]: DEBUG nova.virt.hardware [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 535.734215] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fecb41d9-f921-4aa7-af59-a6f99c6d6f0f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.741638] env[60788]: DEBUG nova.policy [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5781618470a94855a4539ff66776be75', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b67562289dd6472d972bc0ec5c184f32', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 535.752118] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 535.752672] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 535.752672] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 535.753020] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 535.753463] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 535.755332] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c77ca12-989d-4840-a75f-e3c53cbbffb8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.759676] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b630eac2-83b7-437e-b15c-df9ad34185fc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.779029] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Waiting for the task: (returnval){ [ 535.779029] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525db34c-b657-893c-863b-1a577d094b7b" [ 535.779029] env[60788]: _type = "Task" [ 535.779029] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 535.792434] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 535.792434] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 535.792434] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 535.903896] env[60788]: DEBUG nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Received event network-vif-plugged-0760d543-6bb2-4b26-a28e-fc6eb0713565 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 535.905052] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Acquiring lock "f88189b2-070f-4529-af1b-67c8d9b271a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 535.905052] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 535.905518] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 535.906462] env[60788]: DEBUG nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] No waiting events found dispatching network-vif-plugged-0760d543-6bb2-4b26-a28e-fc6eb0713565 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 535.906462] env[60788]: WARNING nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Received unexpected event 
network-vif-plugged-0760d543-6bb2-4b26-a28e-fc6eb0713565 for instance with vm_state building and task_state spawning. [ 535.906462] env[60788]: DEBUG nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Received event network-changed-db8115b3-50ec-462d-998b-ef0aedd79dc1 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 535.906462] env[60788]: DEBUG nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Refreshing instance network info cache due to event network-changed-db8115b3-50ec-462d-998b-ef0aedd79dc1. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 535.907224] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Acquiring lock "refresh_cache-1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 535.907224] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Acquired lock "refresh_cache-1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 535.907224] env[60788]: DEBUG nova.network.neutron [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Refreshing network info cache for port db8115b3-50ec-462d-998b-ef0aedd79dc1 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 535.932407] env[60788]: DEBUG nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Received event network-changed-f299dfa4-7060-466f-b066-ddb11b0f4faf {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 535.932634] env[60788]: DEBUG nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Refreshing instance network info cache due to event network-changed-f299dfa4-7060-466f-b066-ddb11b0f4faf. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 535.932935] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Acquiring lock "refresh_cache-175d889c-2151-4336-920f-db9a54253946" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 535.936029] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Acquired lock "refresh_cache-175d889c-2151-4336-920f-db9a54253946" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 535.936029] env[60788]: DEBUG nova.network.neutron [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Refreshing network info cache for port f299dfa4-7060-466f-b066-ddb11b0f4faf {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 535.961457] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4db72c32-7b16-41fc-b7b7-7c5915181289 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 535.971013] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-411b7a48-b27e-4a8a-b649-c4c07ba6c683 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.013291] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef528f0d-e29e-48ff-b1c0-f28ec94306ec {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.021181] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1a802e-07d1-4598-9078-5185dfcf3f88 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.038333] env[60788]: DEBUG nova.compute.provider_tree [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 536.050782] env[60788]: DEBUG nova.scheduler.client.report [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 536.076526] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.481s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 536.077048] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 536.135910] env[60788]: DEBUG nova.compute.utils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 536.140945] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 536.144064] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 536.156388] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 536.257438] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 536.286781] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 536.287495] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 536.287495] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 536.287495] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 536.287643] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 536.287768] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 536.287898] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 536.288114] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 536.288254] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Got 1 possible 
topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 536.288953] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 536.288953] env[60788]: DEBUG nova.virt.hardware [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 536.292158] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4c3f3b3-2d65-4291-90c9-74f2ad01b0d3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.306116] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4a0dbfe-896a-410f-a1c4-ae183c4ce3d1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 536.454992] env[60788]: DEBUG nova.policy [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9608a7d578f54e3aa974e37153821d4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '936e92b1754a415b9b9d7cff62af1e2b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 537.750598] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 537.750960] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 537.969213] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Successfully updated port: 3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 537.985841] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 
tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "refresh_cache-331fc548-2076-48e2-a84b-94130a99c2ca" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 537.986248] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquired lock "refresh_cache-331fc548-2076-48e2-a84b-94130a99c2ca" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 537.986248] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 538.090111] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 538.430699] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Updating instance_info_cache with network_info: [{"id": "3dea5ea5-b6cd-4c06-9702-bd29d2f336c3", "address": "fa:16:3e:2b:83:3c", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3dea5ea5-b6", "ovs_interfaceid": "3dea5ea5-b6cd-4c06-9702-bd29d2f336c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 538.449617] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Releasing lock "refresh_cache-331fc548-2076-48e2-a84b-94130a99c2ca" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 538.451123] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 
331fc548-2076-48e2-a84b-94130a99c2ca] Instance network_info: |[{"id": "3dea5ea5-b6cd-4c06-9702-bd29d2f336c3", "address": "fa:16:3e:2b:83:3c", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3dea5ea5-b6", "ovs_interfaceid": "3dea5ea5-b6cd-4c06-9702-bd29d2f336c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 538.451316] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2b:83:3c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3dea5ea5-b6cd-4c06-9702-bd29d2f336c3', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 538.465779] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Creating folder: Project (97fcd4e838a74b9eb0d979fce6f2839f). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 538.465779] env[60788]: DEBUG nova.network.neutron [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Updated VIF entry in instance network info cache for port db8115b3-50ec-462d-998b-ef0aedd79dc1. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 538.465930] env[60788]: DEBUG nova.network.neutron [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Updating instance_info_cache with network_info: [{"id": "db8115b3-50ec-462d-998b-ef0aedd79dc1", "address": "fa:16:3e:33:6a:63", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.63", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb8115b3-50", "ovs_interfaceid": "db8115b3-50ec-462d-998b-ef0aedd79dc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 538.470021] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3b2ebe23-57e5-480c-ba9e-d0fa0b0dd296 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.471103] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Successfully created port: f2e43fef-6404-43ea-9246-8a35217f7701 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 538.484127] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Created folder: Project (97fcd4e838a74b9eb0d979fce6f2839f) in parent group-v449747. [ 538.484127] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Creating folder: Instances. Parent ref: group-v449769. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 538.484465] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92997e92-2ea3-4aee-b225-062064c18d89 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.488087] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Releasing lock "refresh_cache-1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 538.488755] env[60788]: DEBUG nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Received event network-changed-0760d543-6bb2-4b26-a28e-fc6eb0713565 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 538.489144] env[60788]: DEBUG nova.compute.manager [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Refreshing instance network info cache due to event network-changed-0760d543-6bb2-4b26-a28e-fc6eb0713565. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 538.489510] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Acquiring lock "refresh_cache-f88189b2-070f-4529-af1b-67c8d9b271a8" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 538.489805] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Acquired lock "refresh_cache-f88189b2-070f-4529-af1b-67c8d9b271a8" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 538.491801] env[60788]: DEBUG nova.network.neutron [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Refreshing network info cache for port 0760d543-6bb2-4b26-a28e-fc6eb0713565 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 538.500349] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Created folder: Instances in parent group-v449769. [ 538.500349] env[60788]: DEBUG oslo.service.loopingcall [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 538.500349] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 538.500349] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e075f5bb-9419-46bc-be86-6b000a0e85ad {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 538.527020] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 538.527020] env[60788]: value = "task-2205115" [ 538.527020] env[60788]: _type = "Task" [ 538.527020] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 538.535828] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205115, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 538.539802] env[60788]: DEBUG nova.network.neutron [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Updated VIF entry in instance network info cache for port f299dfa4-7060-466f-b066-ddb11b0f4faf. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 538.540185] env[60788]: DEBUG nova.network.neutron [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: 175d889c-2151-4336-920f-db9a54253946] Updating instance_info_cache with network_info: [{"id": "f299dfa4-7060-466f-b066-ddb11b0f4faf", "address": "fa:16:3e:b5:55:a4", "network": {"id": "3f7667dd-14bb-45f3-8565-c5f043c1dd46", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1035315435-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab4a6671f8254468aefa43fe62bb8ec9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e6db039c-542c-4544-a57d-ddcc6c1e8e45", "external-id": "nsx-vlan-transportzone-810", "segmentation_id": 810, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf299dfa4-70", "ovs_interfaceid": "f299dfa4-7060-466f-b066-ddb11b0f4faf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 538.553083] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Releasing lock "refresh_cache-175d889c-2151-4336-920f-db9a54253946" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 538.553405] env[60788]: DEBUG nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] 
Received event network-vif-plugged-a959fb32-642f-4e1a-a2a2-4585ed732da1 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 538.553807] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Acquiring lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 538.553878] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 538.554055] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 538.554825] env[60788]: DEBUG nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] No waiting events found dispatching network-vif-plugged-a959fb32-642f-4e1a-a2a2-4585ed732da1 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 538.554825] env[60788]: WARNING nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Received unexpected event network-vif-plugged-a959fb32-642f-4e1a-a2a2-4585ed732da1 for instance with vm_state building and task_state spawning. [ 538.554825] env[60788]: DEBUG nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Received event network-changed-a959fb32-642f-4e1a-a2a2-4585ed732da1 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 538.554825] env[60788]: DEBUG nova.compute.manager [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Refreshing instance network info cache due to event network-changed-a959fb32-642f-4e1a-a2a2-4585ed732da1. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 538.555164] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Acquiring lock "refresh_cache-e883e763-d7c1-4eae-af6e-4a4e4a84e323" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 538.555655] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Acquired lock "refresh_cache-e883e763-d7c1-4eae-af6e-4a4e4a84e323" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 538.555655] env[60788]: DEBUG nova.network.neutron [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Refreshing network info cache for port a959fb32-642f-4e1a-a2a2-4585ed732da1 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 539.040432] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205115, 'name': CreateVM_Task, 'duration_secs': 0.359219} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 539.041835] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 539.041835] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 539.041956] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 539.042244] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 539.042506] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f23234b-c5f1-4c01-9caa-c2396bcc2cbd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 539.050975] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Waiting for the task: (returnval){ [ 539.050975] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5263146f-e582-693e-2a8d-7abd57f48704" [ 539.050975] env[60788]: _type = "Task" [ 539.050975] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 539.065459] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5263146f-e582-693e-2a8d-7abd57f48704, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 539.076870] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Successfully created port: 1c4a0b27-f765-48f2-be3e-a678f360f4b9 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 539.563958] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 539.563958] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 539.565019] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 539.977221] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 539.977728] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 540.315520] env[60788]: DEBUG nova.network.neutron [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Updated VIF entry in instance network info cache for port 0760d543-6bb2-4b26-a28e-fc6eb0713565. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 540.315879] env[60788]: DEBUG nova.network.neutron [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Updating instance_info_cache with network_info: [{"id": "0760d543-6bb2-4b26-a28e-fc6eb0713565", "address": "fa:16:3e:04:87:a2", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0760d543-6b", "ovs_interfaceid": "0760d543-6bb2-4b26-a28e-fc6eb0713565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 540.332341] env[60788]: DEBUG oslo_concurrency.lockutils [req-db8b53de-5d0f-4723-bb6d-c79c35b4168d req-0070a472-48d6-4555-b3db-653c4100a82b service nova] Releasing lock "refresh_cache-f88189b2-070f-4529-af1b-67c8d9b271a8" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 540.356114] env[60788]: DEBUG nova.network.neutron [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Updated VIF entry in instance network info cache for port a959fb32-642f-4e1a-a2a2-4585ed732da1. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 540.356480] env[60788]: DEBUG nova.network.neutron [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Updating instance_info_cache with network_info: [{"id": "a959fb32-642f-4e1a-a2a2-4585ed732da1", "address": "fa:16:3e:b3:1c:97", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa959fb32-64", "ovs_interfaceid": "a959fb32-642f-4e1a-a2a2-4585ed732da1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 540.366215] env[60788]: DEBUG oslo_concurrency.lockutils [req-453e3969-0086-4f4a-82af-a91bf273409e req-ef564840-967b-496d-8442-f64e7cf3914d service nova] Releasing lock "refresh_cache-e883e763-d7c1-4eae-af6e-4a4e4a84e323" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 541.538036] env[60788]: DEBUG nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Received event network-vif-plugged-dc073daa-4854-4964-b072-111ec8dde874 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 541.538401] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Acquiring lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 541.538483] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 541.538617] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 541.538780] env[60788]: DEBUG nova.compute.manager 
[req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] No waiting events found dispatching network-vif-plugged-dc073daa-4854-4964-b072-111ec8dde874 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 541.538965] env[60788]: WARNING nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Received unexpected event network-vif-plugged-dc073daa-4854-4964-b072-111ec8dde874 for instance with vm_state building and task_state spawning. [ 541.542019] env[60788]: DEBUG nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Received event network-vif-plugged-98e811dc-7dd0-460f-ae20-e5cdab2bae31 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 541.542019] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Acquiring lock "fb58b00e-1a78-4750-b912-48c94144ea66-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 541.542019] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Lock "fb58b00e-1a78-4750-b912-48c94144ea66-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 541.542019] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Lock "fb58b00e-1a78-4750-b912-48c94144ea66-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 541.542387] env[60788]: DEBUG nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] No waiting events found dispatching network-vif-plugged-98e811dc-7dd0-460f-ae20-e5cdab2bae31 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 541.542387] env[60788]: WARNING nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Received unexpected event network-vif-plugged-98e811dc-7dd0-460f-ae20-e5cdab2bae31 for instance with vm_state building and task_state spawning. 
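The records above trace nova-compute's external-event plumbing as the log presents it: for each neutron notification the manager takes a per-instance "<uuid>-events" lock, looks for a waiter registered for that exact event, and, when none exists (as here, while the instance is still in vm_state building / task_state spawning), emits the "No waiting events found dispatching ..." line followed by the "Received unexpected event ..." warning. A minimal sketch of that dispatch pattern follows; it uses illustrative names and plain threading primitives, not nova's actual InstanceEvents implementation:

    # Illustrative sketch only -- NOT nova's real InstanceEvents code. It mimics
    # the pattern visible in the log: a per-instance "<uuid>-events" lock guards
    # a table of waiters, and an incoming event either wakes a waiter or is
    # reported as unexpected.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock
            self._waiters = {}              # (instance_uuid, event_name) -> Event

        def prepare_for_event(self, instance_uuid, event_name):
            # Called before an operation that expects an event (e.g. a VIF plug).
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = waiter
            return waiter

        def pop_instance_event(self, instance_uuid, event_name):
            # Called when an external event such as network-vif-plugged-<port>
            # arrives from neutron.
            with self._lock:  # log: lock "<uuid>-events" acquired then released
                waiter = self._waiters.pop((instance_uuid, event_name), None)
            if waiter is None:
                # log: "No waiting events found dispatching ..." plus the
                # WARNING about an unexpected event while building/spawning
                print(f"unexpected event {event_name} for {instance_uuid}")
            else:
                waiter.set()  # wake the thread blocked on this event

The warnings in this log are therefore benign: the events arrived before the spawn path registered a waiter, so they are simply dropped after the cache refresh is scheduled.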
[ 541.542387] env[60788]: DEBUG nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Received event network-changed-dc073daa-4854-4964-b072-111ec8dde874 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 541.542387] env[60788]: DEBUG nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Refreshing instance network info cache due to event network-changed-dc073daa-4854-4964-b072-111ec8dde874. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 541.542387] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Acquiring lock "refresh_cache-231fcc6a-7ec4-4202-b960-ddc966ef2b9c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 541.542532] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Acquired lock "refresh_cache-231fcc6a-7ec4-4202-b960-ddc966ef2b9c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 541.542532] env[60788]: DEBUG nova.network.neutron [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Refreshing network info cache for port dc073daa-4854-4964-b072-111ec8dde874 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 541.561295] env[60788]: DEBUG nova.compute.manager [req-191c5092-91c0-4176-898f-d46e6d830346 req-528bf667-0184-4db8-b650-6f5b7004521e service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Received event network-vif-plugged-3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 541.561570] env[60788]: DEBUG oslo_concurrency.lockutils [req-191c5092-91c0-4176-898f-d46e6d830346 req-528bf667-0184-4db8-b650-6f5b7004521e service nova] Acquiring lock "331fc548-2076-48e2-a84b-94130a99c2ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 541.561699] env[60788]: DEBUG oslo_concurrency.lockutils [req-191c5092-91c0-4176-898f-d46e6d830346 req-528bf667-0184-4db8-b650-6f5b7004521e service nova] Lock "331fc548-2076-48e2-a84b-94130a99c2ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 541.561857] env[60788]: DEBUG oslo_concurrency.lockutils [req-191c5092-91c0-4176-898f-d46e6d830346 req-528bf667-0184-4db8-b650-6f5b7004521e service nova] Lock "331fc548-2076-48e2-a84b-94130a99c2ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 541.562032] env[60788]: DEBUG nova.compute.manager [req-191c5092-91c0-4176-898f-d46e6d830346 req-528bf667-0184-4db8-b650-6f5b7004521e service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] No waiting events 
found dispatching network-vif-plugged-3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 541.562541] env[60788]: WARNING nova.compute.manager [req-191c5092-91c0-4176-898f-d46e6d830346 req-528bf667-0184-4db8-b650-6f5b7004521e service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Received unexpected event network-vif-plugged-3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 for instance with vm_state building and task_state spawning. [ 541.707279] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Successfully updated port: f2e43fef-6404-43ea-9246-8a35217f7701 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 541.725448] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "refresh_cache-ad93b5d9-8983-4aca-a5ee-3e48f1682122" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 541.727078] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired lock "refresh_cache-ad93b5d9-8983-4aca-a5ee-3e48f1682122" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 541.727861] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 541.889406] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 542.331351] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "0259d811-2677-4164-94cd-5c4f5d935f50" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 542.331839] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "0259d811-2677-4164-94cd-5c4f5d935f50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 542.801053] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Successfully updated port: 1c4a0b27-f765-48f2-be3e-a678f360f4b9 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 542.824862] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "refresh_cache-80e7296f-45ed-4987-9884-05bd883f4144" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 542.824862] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "refresh_cache-80e7296f-45ed-4987-9884-05bd883f4144" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 542.824862] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 542.964349] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 542.971280] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Updating instance_info_cache with network_info: [{"id": "f2e43fef-6404-43ea-9246-8a35217f7701", "address": "fa:16:3e:b4:29:ba", "network": {"id": "43370e97-dc16-4d29-8811-2aa769c1960f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543926798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b67562289dd6472d972bc0ec5c184f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946", "external-id": "nsx-vlan-transportzone-453", "segmentation_id": 453, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf2e43fef-64", "ovs_interfaceid": "f2e43fef-6404-43ea-9246-8a35217f7701", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 542.986310] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Releasing lock "refresh_cache-ad93b5d9-8983-4aca-a5ee-3e48f1682122" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 542.987573] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance network_info: |[{"id": "f2e43fef-6404-43ea-9246-8a35217f7701", "address": "fa:16:3e:b4:29:ba", "network": {"id": "43370e97-dc16-4d29-8811-2aa769c1960f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543926798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b67562289dd6472d972bc0ec5c184f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946", "external-id": "nsx-vlan-transportzone-453", "segmentation_id": 453, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf2e43fef-64", "ovs_interfaceid": "f2e43fef-6404-43ea-9246-8a35217f7701", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 542.988196] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:29:ba', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '25f42474-5594-4733-a681-6c69f4afb946', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f2e43fef-6404-43ea-9246-8a35217f7701', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 542.999101] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating folder: Project (b67562289dd6472d972bc0ec5c184f32). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 543.000153] env[60788]: DEBUG nova.network.neutron [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Updated VIF entry in instance network info cache for port dc073daa-4854-4964-b072-111ec8dde874. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 543.001247] env[60788]: DEBUG nova.network.neutron [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Updating instance_info_cache with network_info: [{"id": "dc073daa-4854-4964-b072-111ec8dde874", "address": "fa:16:3e:aa:e3:e3", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.167", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdc073daa-48", "ovs_interfaceid": "dc073daa-4854-4964-b072-111ec8dde874", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 543.003312] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e95f951d-6354-4fb5-8afd-04f1353385cd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.017041] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Releasing lock "refresh_cache-231fcc6a-7ec4-4202-b960-ddc966ef2b9c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 543.017656] env[60788]: DEBUG 
nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Received event network-changed-98e811dc-7dd0-460f-ae20-e5cdab2bae31 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 543.017656] env[60788]: DEBUG nova.compute.manager [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Refreshing instance network info cache due to event network-changed-98e811dc-7dd0-460f-ae20-e5cdab2bae31. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 543.017656] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Acquiring lock "refresh_cache-fb58b00e-1a78-4750-b912-48c94144ea66" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 543.017901] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Acquired lock "refresh_cache-fb58b00e-1a78-4750-b912-48c94144ea66" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 543.017937] env[60788]: DEBUG nova.network.neutron [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Refreshing network info cache for port 98e811dc-7dd0-460f-ae20-e5cdab2bae31 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 543.020669] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Created folder: Project (b67562289dd6472d972bc0ec5c184f32) in parent group-v449747. [ 543.021041] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating folder: Instances. Parent ref: group-v449772. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 543.021287] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2216f66a-1208-40e8-b327-87ffa182fb85 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.036627] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Created folder: Instances in parent group-v449772. [ 543.037014] env[60788]: DEBUG oslo.service.loopingcall [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 543.037101] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 543.037248] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-88bf5124-44ac-4152-a2c1-9755f2993f42 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.060588] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 543.060588] env[60788]: value = "task-2205118" [ 543.060588] env[60788]: _type = "Task" [ 543.060588] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 543.071491] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205118, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 543.578337] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205118, 'name': CreateVM_Task, 'duration_secs': 0.376206} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 543.578337] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 543.578337] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 543.578337] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 543.578337] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 543.578612] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f2b42609-d2cc-4cf7-81a9-56c2b7d2c6c9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.587702] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 543.587702] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52c6b7ab-e790-25b3-345c-33f05b96fd26" [ 543.587702] env[60788]: _type = "Task" [ 543.587702] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 543.598839] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52c6b7ab-e790-25b3-345c-33f05b96fd26, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 543.852455] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Updating instance_info_cache with network_info: [{"id": "1c4a0b27-f765-48f2-be3e-a678f360f4b9", "address": "fa:16:3e:44:9d:68", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c4a0b27-f7", "ovs_interfaceid": "1c4a0b27-f765-48f2-be3e-a678f360f4b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 543.854830] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 543.855358] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 543.868789] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "refresh_cache-80e7296f-45ed-4987-9884-05bd883f4144" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 543.872120] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 
tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance network_info: |[{"id": "1c4a0b27-f765-48f2-be3e-a678f360f4b9", "address": "fa:16:3e:44:9d:68", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c4a0b27-f7", "ovs_interfaceid": "1c4a0b27-f765-48f2-be3e-a678f360f4b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 543.872267] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:44:9d:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed3ffc1d-9f86-4029-857e-6cd1d383edbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1c4a0b27-f765-48f2-be3e-a678f360f4b9', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 543.878413] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating folder: Project (936e92b1754a415b9b9d7cff62af1e2b). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 543.879825] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e48ef1ae-f936-43f9-b7e6-3bed97783db1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.889496] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created folder: Project (936e92b1754a415b9b9d7cff62af1e2b) in parent group-v449747. [ 543.889694] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating folder: Instances. Parent ref: group-v449775. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 543.889928] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-babd48a7-1567-4ce7-8160-2a585b4c9f10 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.900180] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created folder: Instances in parent group-v449775. [ 543.900491] env[60788]: DEBUG oslo.service.loopingcall [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 543.900692] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 543.900942] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b38b527f-4e41-4322-9ffc-f0b25580ea31 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 543.927897] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 543.927897] env[60788]: value = "task-2205121" [ 543.927897] env[60788]: _type = "Task" [ 543.927897] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 543.936559] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205121, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 544.104211] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 544.104211] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 544.104211] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 544.151667] env[60788]: DEBUG nova.network.neutron [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Updated VIF entry in instance network info cache for port 98e811dc-7dd0-460f-ae20-e5cdab2bae31. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 544.152591] env[60788]: DEBUG nova.network.neutron [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Updating instance_info_cache with network_info: [{"id": "98e811dc-7dd0-460f-ae20-e5cdab2bae31", "address": "fa:16:3e:b8:75:16", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.124", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap98e811dc-7d", "ovs_interfaceid": "98e811dc-7dd0-460f-ae20-e5cdab2bae31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 544.162289] env[60788]: DEBUG oslo_concurrency.lockutils [req-aa9431c3-f948-43f4-8106-1ff315f76e71 req-574f573c-5300-4a43-8f23-62655211ca8d service nova] Releasing lock "refresh_cache-fb58b00e-1a78-4750-b912-48c94144ea66" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 544.438570] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205121, 'name': CreateVM_Task, 'duration_secs': 0.319078} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 544.438749] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 544.439504] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 544.439716] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 544.440082] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 544.440372] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-84410c5c-e5a6-499d-8562-4218e62ad926 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 544.445229] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 544.445229] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52a0b68f-b8ab-0bac-e1c8-570172d2615e" [ 544.445229] env[60788]: _type = "Task" [ 544.445229] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 544.453374] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52a0b68f-b8ab-0bac-e1c8-570172d2615e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 544.890705] env[60788]: DEBUG nova.compute.manager [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Received event network-changed-3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 544.890984] env[60788]: DEBUG nova.compute.manager [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Refreshing instance network info cache due to event network-changed-3dea5ea5-b6cd-4c06-9702-bd29d2f336c3. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 544.891101] env[60788]: DEBUG oslo_concurrency.lockutils [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] Acquiring lock "refresh_cache-331fc548-2076-48e2-a84b-94130a99c2ca" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 544.891604] env[60788]: DEBUG oslo_concurrency.lockutils [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] Acquired lock "refresh_cache-331fc548-2076-48e2-a84b-94130a99c2ca" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 544.891604] env[60788]: DEBUG nova.network.neutron [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Refreshing network info cache for port 3dea5ea5-b6cd-4c06-9702-bd29d2f336c3 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 544.920093] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Received event network-vif-plugged-f2e43fef-6404-43ea-9246-8a35217f7701 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 544.920459] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Acquiring lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 544.920768] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 544.921212] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 544.921629] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] No waiting events found dispatching network-vif-plugged-f2e43fef-6404-43ea-9246-8a35217f7701 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 544.921850] env[60788]: WARNING nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Received unexpected event network-vif-plugged-f2e43fef-6404-43ea-9246-8a35217f7701 for instance with vm_state building and task_state spawning. 
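[editor's note] The records above and below show the oslo.vmware task idiom that recurs throughout this log: an asynchronous vCenter call such as CreateVM_Task or SearchDatastore_Task returns a task reference, and the caller blocks in wait_for_task() while _poll_task() reports progress until the task completes or raises a translated fault. A minimal sketch of that pattern, assuming placeholder connection values and a hypothetical helper name rather than nova's actual call site:

    # Sketch only: drive an asynchronous vCenter task with oslo.vmware the
    # way the wait_for_task/_poll_task records in this log do. The helper
    # name, host, credentials, and tuning values are placeholders.
    from oslo_vmware import api
    from oslo_vmware import exceptions as vmw_exc

    def wait_for_vcenter_task(task_ref, host, user, password):
        # Nova shares one session per driver; creating a session per call
        # here is only to keep the example self-contained.
        session = api.VMwareAPISession(host, user, password,
                                       api_retry_count=10,
                                       task_poll_interval=0.5)
        try:
            # wait_for_task() re-polls the task every task_poll_interval
            # seconds (the "progress is 0%" records) and returns the
            # task_info once the task succeeds.
            return session.wait_for_task(task_ref)
        except vmw_exc.VimFaultException as exc:
            # A failed task surfaces as a VimFaultException whose
            # fault_list names the vSphere faults, e.g. ['InvalidArgument']
            # as in the CopyVirtualDisk_Task failure later in this log.
            raise

Re-raising keeps the translated fault intact, which is what lets _build_and_run_instance log the "Faults: ['InvalidArgument']" tracebacks seen further down. [end note]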
[ 544.922145] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Received event network-changed-f2e43fef-6404-43ea-9246-8a35217f7701 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 544.922402] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Refreshing instance network info cache due to event network-changed-f2e43fef-6404-43ea-9246-8a35217f7701. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 544.924033] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Acquiring lock "refresh_cache-ad93b5d9-8983-4aca-a5ee-3e48f1682122" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 544.924033] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Acquired lock "refresh_cache-ad93b5d9-8983-4aca-a5ee-3e48f1682122" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 544.924033] env[60788]: DEBUG nova.network.neutron [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Refreshing network info cache for port f2e43fef-6404-43ea-9246-8a35217f7701 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 544.960365] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 544.960365] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 544.960365] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.729630] env[60788]: DEBUG nova.network.neutron [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Updated VIF entry in instance network info cache for port 3dea5ea5-b6cd-4c06-9702-bd29d2f336c3. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 545.730030] env[60788]: DEBUG nova.network.neutron [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Updating instance_info_cache with network_info: [{"id": "3dea5ea5-b6cd-4c06-9702-bd29d2f336c3", "address": "fa:16:3e:2b:83:3c", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3dea5ea5-b6", "ovs_interfaceid": "3dea5ea5-b6cd-4c06-9702-bd29d2f336c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 545.741291] env[60788]: DEBUG oslo_concurrency.lockutils [req-d4e24498-bd8b-4348-807f-f0779933cc53 req-a2b71294-f4cf-4b97-9d6f-a6fc6ca79a70 service nova] Releasing lock "refresh_cache-331fc548-2076-48e2-a84b-94130a99c2ca" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 545.768955] env[60788]: DEBUG nova.network.neutron [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Updated VIF entry in instance network info cache for port f2e43fef-6404-43ea-9246-8a35217f7701. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 545.768955] env[60788]: DEBUG nova.network.neutron [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Updating instance_info_cache with network_info: [{"id": "f2e43fef-6404-43ea-9246-8a35217f7701", "address": "fa:16:3e:b4:29:ba", "network": {"id": "43370e97-dc16-4d29-8811-2aa769c1960f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543926798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b67562289dd6472d972bc0ec5c184f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946", "external-id": "nsx-vlan-transportzone-453", "segmentation_id": 453, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf2e43fef-64", "ovs_interfaceid": "f2e43fef-6404-43ea-9246-8a35217f7701", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 545.788446] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Releasing lock "refresh_cache-ad93b5d9-8983-4aca-a5ee-3e48f1682122" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 545.788446] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Received event network-vif-plugged-1c4a0b27-f765-48f2-be3e-a678f360f4b9 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 545.788446] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Acquiring lock "80e7296f-45ed-4987-9884-05bd883f4144-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 545.788818] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Lock "80e7296f-45ed-4987-9884-05bd883f4144-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 545.788818] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Lock "80e7296f-45ed-4987-9884-05bd883f4144-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 545.788938] env[60788]: DEBUG 
nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] No waiting events found dispatching network-vif-plugged-1c4a0b27-f765-48f2-be3e-a678f360f4b9 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 545.789037] env[60788]: WARNING nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Received unexpected event network-vif-plugged-1c4a0b27-f765-48f2-be3e-a678f360f4b9 for instance with vm_state building and task_state spawning. [ 545.789196] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Received event network-changed-1c4a0b27-f765-48f2-be3e-a678f360f4b9 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 545.789366] env[60788]: DEBUG nova.compute.manager [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Refreshing instance network info cache due to event network-changed-1c4a0b27-f765-48f2-be3e-a678f360f4b9. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 545.789774] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Acquiring lock "refresh_cache-80e7296f-45ed-4987-9884-05bd883f4144" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 545.789774] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Acquired lock "refresh_cache-80e7296f-45ed-4987-9884-05bd883f4144" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 545.789944] env[60788]: DEBUG nova.network.neutron [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Refreshing network info cache for port 1c4a0b27-f765-48f2-be3e-a678f360f4b9 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 546.818265] env[60788]: DEBUG nova.network.neutron [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Updated VIF entry in instance network info cache for port 1c4a0b27-f765-48f2-be3e-a678f360f4b9. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 546.819019] env[60788]: DEBUG nova.network.neutron [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Updating instance_info_cache with network_info: [{"id": "1c4a0b27-f765-48f2-be3e-a678f360f4b9", "address": "fa:16:3e:44:9d:68", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c4a0b27-f7", "ovs_interfaceid": "1c4a0b27-f765-48f2-be3e-a678f360f4b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 546.831617] env[60788]: DEBUG oslo_concurrency.lockutils [req-d6fcdcea-ef18-4dc2-9f97-3fb74a5a0a01 req-afc1d437-6d59-4eee-9750-4bdd1002a9cd service nova] Releasing lock "refresh_cache-80e7296f-45ed-4987-9884-05bd883f4144" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 546.862727] env[60788]: DEBUG oslo_concurrency.lockutils [None req-97ca0ff1-250a-439a-b9df-1b839dfc77f6 tempest-ServerMetadataTestJSON-2090774188 tempest-ServerMetadataTestJSON-2090774188-project-member] Acquiring lock "c7380613-f621-4c56-9e8f-6b4f8dfe3ef1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 546.862962] env[60788]: DEBUG oslo_concurrency.lockutils [None req-97ca0ff1-250a-439a-b9df-1b839dfc77f6 tempest-ServerMetadataTestJSON-2090774188 tempest-ServerMetadataTestJSON-2090774188-project-member] Lock "c7380613-f621-4c56-9e8f-6b4f8dfe3ef1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 547.454366] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5a034956-c054-4769-b4b2-ad45afad48e3 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Acquiring lock "dc9706cc-1c7e-4570-8607-20120306153c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 547.454999] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5a034956-c054-4769-b4b2-ad45afad48e3 tempest-AttachVolumeNegativeTest-1916379500 
tempest-AttachVolumeNegativeTest-1916379500-project-member] Lock "dc9706cc-1c7e-4570-8607-20120306153c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 547.906235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-31e5baab-94ed-4e4c-a248-2649db17ae79 tempest-ServersNegativeTestJSON-1857535569 tempest-ServersNegativeTestJSON-1857535569-project-member] Acquiring lock "9aec9e50-0470-43f7-98b2-3f2eac50e6bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 547.906755] env[60788]: DEBUG oslo_concurrency.lockutils [None req-31e5baab-94ed-4e4c-a248-2649db17ae79 tempest-ServersNegativeTestJSON-1857535569 tempest-ServersNegativeTestJSON-1857535569-project-member] Lock "9aec9e50-0470-43f7-98b2-3f2eac50e6bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 549.268515] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d180e74c-e402-4d07-8908-b0efc419db27 tempest-VolumesAssistedSnapshotsTest-1856435780 tempest-VolumesAssistedSnapshotsTest-1856435780-project-member] Acquiring lock "929a6bc2-109d-4753-8b98-8155c0e4e839" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 549.268867] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d180e74c-e402-4d07-8908-b0efc419db27 tempest-VolumesAssistedSnapshotsTest-1856435780 tempest-VolumesAssistedSnapshotsTest-1856435780-project-member] Lock "929a6bc2-109d-4753-8b98-8155c0e4e839" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 552.637192] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37167931-5b27-4a56-99c9-ba329e7688d4 tempest-ServersWithSpecificFlavorTestJSON-1896831385 tempest-ServersWithSpecificFlavorTestJSON-1896831385-project-member] Acquiring lock "ee5957af-d4c1-4c71-82ba-83c06eb08869" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 552.638065] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37167931-5b27-4a56-99c9-ba329e7688d4 tempest-ServersWithSpecificFlavorTestJSON-1896831385 tempest-ServersWithSpecificFlavorTestJSON-1896831385-project-member] Lock "ee5957af-d4c1-4c71-82ba-83c06eb08869" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 558.976361] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fafb2cac-a1c8-4ff5-94cd-15183f4ef5ac tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] Acquiring lock "951453a7-a034-4111-9c5c-71d5c25245ff" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 558.976688] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fafb2cac-a1c8-4ff5-94cd-15183f4ef5ac tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] Lock "951453a7-a034-4111-9c5c-71d5c25245ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.984334] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7d7e54e7-25eb-4724-9c63-dd7ca41cff5b tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Acquiring lock "c27066c4-1fbb-4918-94c4-62a8bf1c2dda" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.985559] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7d7e54e7-25eb-4724-9c63-dd7ca41cff5b tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Lock "c27066c4-1fbb-4918-94c4-62a8bf1c2dda" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 564.650055] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1b3f4b47-9993-4a1c-808c-7df0b4bca4b0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Acquiring lock "a92343b0-aad3-4416-9305-1432b35ae1a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 564.650382] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1b3f4b47-9993-4a1c-808c-7df0b4bca4b0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Lock "a92343b0-aad3-4416-9305-1432b35ae1a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 564.706201] env[60788]: DEBUG oslo_concurrency.lockutils [None req-2bbfd857-58c7-4740-9290-f170bf8ad744 tempest-ServersTestFqdnHostnames-2032606859 tempest-ServersTestFqdnHostnames-2032606859-project-member] Acquiring lock "cd2ac191-ea52-43cd-a20b-87b963112818" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 564.706829] env[60788]: DEBUG oslo_concurrency.lockutils [None req-2bbfd857-58c7-4740-9290-f170bf8ad744 tempest-ServersTestFqdnHostnames-2032606859 tempest-ServersTestFqdnHostnames-2032606859-project-member] Lock "cd2ac191-ea52-43cd-a20b-87b963112818" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 565.774109] 
env[60788]: DEBUG oslo_concurrency.lockutils [None req-d3e697bf-90c1-4b85-acd2-e8578cf21fca tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] Acquiring lock "008c517b-5838-43a0-aad3-5c7436d00275" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 565.774527] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d3e697bf-90c1-4b85-acd2-e8578cf21fca tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] Lock "008c517b-5838-43a0-aad3-5c7436d00275" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 566.673158] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8d9448e2-31bd-45f7-8631-b52590c958f0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Acquiring lock "e4bfd8ff-c503-420a-8a89-34c652d9fb2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.673676] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8d9448e2-31bd-45f7-8631-b52590c958f0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Lock "e4bfd8ff-c503-420a-8a89-34c652d9fb2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 566.679066] env[60788]: DEBUG oslo_concurrency.lockutils [None req-97c5b76c-1771-41ce-b915-246f2c2379e4 tempest-ServersTestJSON-162999821 tempest-ServersTestJSON-162999821-project-member] Acquiring lock "1e6f4b96-7207-4208-9193-b0d207b1c703" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 566.679284] env[60788]: DEBUG oslo_concurrency.lockutils [None req-97c5b76c-1771-41ce-b915-246f2c2379e4 tempest-ServersTestJSON-162999821 tempest-ServersTestJSON-162999821-project-member] Lock "1e6f4b96-7207-4208-9193-b0d207b1c703" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 569.856637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a3ae2cfa-bddb-4d68-b3c4-cf196a6e2518 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] Acquiring lock "5f6d54d4-6862-46b0-9558-5db0c5b392d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 569.856932] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a3ae2cfa-bddb-4d68-b3c4-cf196a6e2518 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] Lock "5f6d54d4-6862-46b0-9558-5db0c5b392d1" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.743568] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0bcc92c4-e4e6-47e7-a654-c7da1bc77663 tempest-InstanceActionsV221TestJSON-1379281465 tempest-InstanceActionsV221TestJSON-1379281465-project-member] Acquiring lock "977ea808-4e2d-4388-a5af-93048b5754e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.743805] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0bcc92c4-e4e6-47e7-a654-c7da1bc77663 tempest-InstanceActionsV221TestJSON-1379281465 tempest-InstanceActionsV221TestJSON-1379281465-project-member] Lock "977ea808-4e2d-4388-a5af-93048b5754e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 573.696709] env[60788]: WARNING oslo_vmware.rw_handles [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 573.696709] env[60788]: ERROR oslo_vmware.rw_handles [ 573.697638] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 573.701903] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 
573.701903] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Copying Virtual Disk [datastore2] vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/4f43ddb3-2daa-4b84-9a91-05b7300e6640/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 573.701903] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-01dda0e1-1c2b-493a-858c-7e0ff25d1547 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 573.713013] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Waiting for the task: (returnval){ [ 573.713013] env[60788]: value = "task-2205122" [ 573.713013] env[60788]: _type = "Task" [ 573.713013] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 573.724023] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Task: {'id': task-2205122, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 574.234453] env[60788]: DEBUG oslo_vmware.exceptions [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 574.234757] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 574.237738] env[60788]: ERROR nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 574.237738] env[60788]: Faults: ['InvalidArgument'] [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Traceback (most recent call last): [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] yield resources [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self.driver.spawn(context, instance, image_meta, [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self._fetch_image_if_missing(context, vi) [ 574.237738] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] image_cache(vi, tmp_image_ds_loc) [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] vm_util.copy_virtual_disk( [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] session._wait_for_task(vmdk_copy_task) [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] return self.wait_for_task(task_ref) [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] return evt.wait() [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] result = hub.switch() [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 574.238196] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] return self.greenlet.switch() [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self.f(*self.args, **self.kw) [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] raise exceptions.translate_fault(task_info.error) [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Faults: ['InvalidArgument'] [ 574.238578] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] [ 574.242090] env[60788]: INFO nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Terminating instance [ 574.243881] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 574.244098] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 574.244352] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8c8253a3-2f49-45dd-a932-07b41c51a0a5 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.247494] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "refresh_cache-1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 574.247658] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquired lock "refresh_cache-1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 574.247822] env[60788]: DEBUG nova.network.neutron [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 574.255813] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 574.255813] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 574.256821] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e243c375-ffd1-4fd7-a245-2f54b304d987 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.267217] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Waiting for the task: (returnval){ [ 574.267217] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52598024-9b76-133b-d172-8fcafc3825e6" [ 574.267217] env[60788]: _type = "Task" [ 574.267217] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 574.280366] env[60788]: DEBUG nova.network.neutron [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 574.282253] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 574.282474] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Creating directory with path [datastore2] vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 574.283021] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-31d47269-522b-4899-bc78-5f6c7ad56329 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.306835] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Created directory with path [datastore2] vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 574.307052] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Fetch image to [datastore2] vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 574.307219] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 574.308053] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-748ba2ba-d11c-494d-ba8e-d838ea649797 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.320266] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf03e0c7-ce21-4070-b773-0f796cd3d0f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.334225] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f4b851-59d9-4377-bc8f-666127ec5ada {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.377200] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49f95374-98be-4f4d-990e-87986dd6d6c0 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.383684] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-37eafcd7-10f3-4223-928e-65cc9a17ceb8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 574.416252] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 574.503167] env[60788]: DEBUG oslo_vmware.rw_handles [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 574.570956] env[60788]: DEBUG oslo_vmware.rw_handles [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 574.571192] env[60788]: DEBUG oslo_vmware.rw_handles [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 574.663189] env[60788]: DEBUG nova.network.neutron [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 574.678871] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Releasing lock "refresh_cache-1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 574.678871] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 574.678871] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 574.678871] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d9e373-4440-4eb9-a304-96a54391edce {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.692657] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 574.692946] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-52f6b911-d750-4576-987a-3f79c1d7ee55 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.726915] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 574.727285] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 574.727439] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Deleting the datastore file [datastore2] 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 574.727708] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ca2074c8-db5c-451c-a834-f71baceed144 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 574.741502] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Waiting for the task: (returnval){
[ 574.741502] env[60788]: value = "task-2205124"
[ 574.741502] env[60788]: _type = "Task"
[ 574.741502] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 574.752285] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Task: {'id': task-2205124, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 575.253627] env[60788]: DEBUG oslo_vmware.api [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Task: {'id': task-2205124, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.037696} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 575.253893] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 575.255248] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 575.255248] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 575.255547] env[60788]: INFO nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Took 0.58 seconds to destroy the instance on the hypervisor.
[ 575.255795] env[60788]: DEBUG oslo.service.loopingcall [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 575.258384] env[60788]: DEBUG nova.compute.manager [-] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Skipping network deallocation for instance since networking was not requested. {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 575.261737] env[60788]: DEBUG nova.compute.claims [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 575.261896] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 575.262157] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 575.862257] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80cb0276-a882-4a7c-a95c-4ffe94f2a484 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.871214] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-492a71ba-cb57-4322-b3b3-5a32c1292bb6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.902982] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a36dbb4-3e12-4182-bbde-4e3422ffe16c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.910449] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-904130e0-39fe-4483-b520-e7160cebad04 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.924814] env[60788]: DEBUG nova.compute.provider_tree [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 575.941421] env[60788]: DEBUG nova.scheduler.client.report [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 575.959492] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.695s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 575.959492] env[60788]: ERROR nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 575.959492] env[60788]: Faults: ['InvalidArgument']
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Traceback (most recent call last):
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self.driver.spawn(context, instance, image_meta,
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 575.959492] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self._fetch_image_if_missing(context, vi)
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] image_cache(vi, tmp_image_ds_loc)
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] vm_util.copy_virtual_disk(
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] session._wait_for_task(vmdk_copy_task)
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] return self.wait_for_task(task_ref)
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] return evt.wait()
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] result = hub.switch()
[ 575.960040] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] return self.greenlet.switch()
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] self.f(*self.args, **self.kw)
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] raise exceptions.translate_fault(task_info.error)
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Faults: ['InvalidArgument']
[ 575.960441] env[60788]: ERROR nova.compute.manager [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f]
[ 575.960441] env[60788]: DEBUG nova.compute.utils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 575.963179] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Build of instance 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f was re-scheduled: A specified parameter was not correct: fileType
[ 575.963179] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 575.963309] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 575.963915] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquiring lock "refresh_cache-1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 575.963915] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Acquired lock "refresh_cache-1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 575.963915] env[60788]: DEBUG nova.network.neutron [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 576.006912] env[60788]: DEBUG nova.network.neutron [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 576.162590] env[60788]: DEBUG nova.network.neutron [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 576.181022] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Releasing lock "refresh_cache-1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 576.181022] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 576.181022] env[60788]: DEBUG nova.compute.manager [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] [instance: 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f] Skipping network deallocation for instance since networking was not requested. {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 576.318870] env[60788]: INFO nova.scheduler.client.report [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Deleted allocations for instance 1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f
[ 576.354329] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1163dbc5-288e-416f-bf0c-ecb9f91203c5 tempest-ServersAdmin275Test-574239960 tempest-ServersAdmin275Test-574239960-project-member] Lock "1fd4bb19-8ac0-46c0-9181-2c1ab6128c8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 55.962s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 576.383067] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 576.463637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 576.464040] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 576.465925] env[60788]: INFO nova.compute.claims [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 577.006429] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd508534-4d33-4e8e-9a89-129d72e6d4ee {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.015026] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7c9b0fb-3ea9-415f-aab3-532f1b0edb63 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.050176] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be111b2c-406b-4bb3-9037-92b615e3dc17 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.059342] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a572807b-5198-4146-a712-ac5c8a618c8a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.075767] env[60788]: DEBUG nova.compute.provider_tree [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 577.087147] env[60788]: DEBUG nova.scheduler.client.report [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 577.107785] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.644s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 577.108502] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 577.155811] env[60788]: DEBUG nova.compute.utils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 577.157325] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Not allocating networking since 'none' was specified. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}}
[ 577.174321] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 577.272894] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 577.315015] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "01821598-4692-440b-8128-c50e359386e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 577.315015] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "01821598-4692-440b-8128-c50e359386e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 577.319240] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 577.319741] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 577.319905] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 577.320853] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 577.321049] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 577.321302] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 577.321537] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 577.322404] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 577.322404] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 577.322404] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 577.322404] env[60788]: DEBUG nova.virt.hardware [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 577.323612] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cdae849-49b9-4bd6-9efc-315ed44221d0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.335558] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f25d61d-42b6-4caf-835e-65ef911f5906 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.351186] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance VIF info [] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 577.359602] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Creating folder: Project (9876a25333d1452b804060892dea744b). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 577.359602] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-75279165-2ba7-41f1-af59-0999bf8ba25f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.371821] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "6f437d9f-f904-46ab-9dc6-9902c2ea4c71" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 577.371821] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "6f437d9f-f904-46ab-9dc6-9902c2ea4c71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 577.372388] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Created folder: Project (9876a25333d1452b804060892dea744b) in parent group-v449747.
[ 577.372388] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Creating folder: Instances. Parent ref: group-v449778. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 577.372513] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf3336d3-abd2-4f32-9b28-366aa0ee5041 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.382535] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Created folder: Instances in parent group-v449778.
[ 577.383016] env[60788]: DEBUG oslo.service.loopingcall [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 577.383160] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 577.383470] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-57749c78-422c-49b8-bc0a-bc7bdc452288 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.403695] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 577.403695] env[60788]: value = "task-2205127"
[ 577.403695] env[60788]: _type = "Task"
[ 577.403695] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 577.414315] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205127, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 577.914525] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205127, 'name': CreateVM_Task, 'duration_secs': 0.278829} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 577.914944] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 577.915538] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 577.915873] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 577.918092] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 577.918092] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4d9b72a1-5aa4-4811-a70e-bc9f1412c9f2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 577.921544] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for the task: (returnval){
[ 577.921544] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52952a79-e6f0-dc80-c7ce-ca7785f4b345"
[ 577.921544] env[60788]: _type = "Task"
[ 577.921544] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 577.932054] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52952a79-e6f0-dc80-c7ce-ca7785f4b345, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 578.435878] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 578.436268] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 578.436511] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 583.995973] env[60788]: DEBUG oslo_concurrency.lockutils [None req-db9e33ef-b90c-4212-88a2-d697728b61e6 tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] Acquiring lock "ef9990e1-e0a7-41c0-b738-de213fd7046a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 583.996465] env[60788]: DEBUG oslo_concurrency.lockutils [None req-db9e33ef-b90c-4212-88a2-d697728b61e6 tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] Lock "ef9990e1-e0a7-41c0-b738-de213fd7046a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 585.968970] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c9d3fce2-9ee6-4e64-9b70-e3d9c0afcea4 tempest-FloatingIPsAssociationTestJSON-421040393 tempest-FloatingIPsAssociationTestJSON-421040393-project-member] Acquiring lock "d159fbfa-f391-41f8-97ba-eb145eed26e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 585.968970] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c9d3fce2-9ee6-4e64-9b70-e3d9c0afcea4 tempest-FloatingIPsAssociationTestJSON-421040393 tempest-FloatingIPsAssociationTestJSON-421040393-project-member] Lock "d159fbfa-f391-41f8-97ba-eb145eed26e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 585.968970] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3d0c9844-ecd7-470c-91a3-7db11090c13a tempest-ServerGroupTestJSON-1121452212 tempest-ServerGroupTestJSON-1121452212-project-member] Acquiring lock "064e7e7c-eeca-4822-9d5a-148b9fbdc1f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 585.969332] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3d0c9844-ecd7-470c-91a3-7db11090c13a tempest-ServerGroupTestJSON-1121452212 tempest-ServerGroupTestJSON-1121452212-project-member] Lock "064e7e7c-eeca-4822-9d5a-148b9fbdc1f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 589.352486] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.382823] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.382823] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}}
[ 589.382823] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}}
[ 589.402788] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.402947] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403094] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 175d889c-2151-4336-920f-db9a54253946] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403227] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403354] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403551] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403694] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403831] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.403960] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.404385] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 589.404385] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}}
[ 589.404666] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.404887] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.405035] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}}
[ 589.753992] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.754251] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.754420] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 589.767472] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 589.767472] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 589.767472] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 589.767472] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 589.767884] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d51e95f6-fd4d-43b0-9788-e76feaf4025e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 589.777098] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0130175d-639b-4b34-89e6-d53dd4c2af08 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 589.792551] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cd8dbf6-1e62-4fc2-9b48-4e423d3718f5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 589.799550] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a2a5e15-2238-4994-8797-9062f21a28d0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 589.831098] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181153MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 589.831267] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 589.831460] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 589.903097] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f88189b2-070f-4529-af1b-67c8d9b271a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903265] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903712] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 175d889c-2151-4336-920f-db9a54253946 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903712] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903712] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903866] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb58b00e-1a78-4750-b912-48c94144ea66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903866] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 331fc548-2076-48e2-a84b-94130a99c2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.903996] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.904077] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.904191] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 589.933141] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 589.959025] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 589.970410] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 589.981372] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c7380613-f621-4c56-9e8f-6b4f8dfe3ef1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 589.992184] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dc9706cc-1c7e-4570-8607-20120306153c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.003313] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 9aec9e50-0470-43f7-98b2-3f2eac50e6bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.015451] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 929a6bc2-109d-4753-8b98-8155c0e4e839 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.025717] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ee5957af-d4c1-4c71-82ba-83c06eb08869 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.037757] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 951453a7-a034-4111-9c5c-71d5c25245ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.049608] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c27066c4-1fbb-4918-94c4-62a8bf1c2dda has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.068995] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a92343b0-aad3-4416-9305-1432b35ae1a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.095679] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance cd2ac191-ea52-43cd-a20b-87b963112818 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.109189] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 008c517b-5838-43a0-aad3-5c7436d00275 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.124876] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1e6f4b96-7207-4208-9193-b0d207b1c703 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.135915] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e4bfd8ff-c503-420a-8a89-34c652d9fb2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.145897] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5f6d54d4-6862-46b0-9558-5db0c5b392d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.159608] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 977ea808-4e2d-4388-a5af-93048b5754e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.171435] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.183493] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6f437d9f-f904-46ab-9dc6-9902c2ea4c71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.198174] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ef9990e1-e0a7-41c0-b738-de213fd7046a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.208818] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d159fbfa-f391-41f8-97ba-eb145eed26e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.219352] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 064e7e7c-eeca-4822-9d5a-148b9fbdc1f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 590.219936] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 590.220876] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 590.645926] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adf6415d-d991-4e8c-b487-8902e4273e28 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 590.654129] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c7a43db-7c93-4c17-82e0-f963421391a2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 590.686996] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bbe2fd3-b286-4c34-a733-f037d15a9bb6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 590.694639] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c312050-5480-4acd-902e-9d52d81586d3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 590.708249] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 590.716458] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 590.729493] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 590.729675] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.898s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 591.725108] env[60788]: DEBUG oslo_service.periodic_task [None
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 591.725397] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 591.725542] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 599.684828] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8579c047-2ba7-460f-8a0e-e547b1b073fa tempest-AttachInterfacesV270Test-1164081489 tempest-AttachInterfacesV270Test-1164081489-project-member] Acquiring lock "4e86b919-f5f8-458c-a588-bd08bdcccf3b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.685482] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8579c047-2ba7-460f-8a0e-e547b1b073fa tempest-AttachInterfacesV270Test-1164081489 tempest-AttachInterfacesV270Test-1164081489-project-member] Lock "4e86b919-f5f8-458c-a588-bd08bdcccf3b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 608.681065] env[60788]: DEBUG oslo_concurrency.lockutils [None req-125c2602-d133-402f-a81c-ca494c37f0b9 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "e991879c-de94-4e14-9480-95c95bcaaa05" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.681339] env[60788]: DEBUG oslo_concurrency.lockutils [None req-125c2602-d133-402f-a81c-ca494c37f0b9 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e991879c-de94-4e14-9480-95c95bcaaa05" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 622.185774] env[60788]: WARNING oslo_vmware.rw_handles [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles 
response.begin() [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 622.185774] env[60788]: ERROR oslo_vmware.rw_handles [ 622.186359] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 622.188008] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 622.188299] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Copying Virtual Disk [datastore2] vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/8704ddf5-5bf9-4a88-ad8b-ccb90b5d0a0f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 622.190034] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-61dfc2ca-90de-47c8-80b4-e4f17f19376f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.197080] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Waiting for the task: (returnval){ [ 622.197080] env[60788]: value = "task-2205128" [ 622.197080] env[60788]: _type = "Task" [ 622.197080] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 622.206555] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Task: {'id': task-2205128, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 622.707694] env[60788]: DEBUG oslo_vmware.exceptions [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 622.707988] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 622.708564] env[60788]: ERROR nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 622.708564] env[60788]: Faults: ['InvalidArgument'] [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] Traceback (most recent call last): [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] yield resources [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self.driver.spawn(context, instance, image_meta, [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self._vmops.spawn(context, instance, image_meta, injected_files, [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self._fetch_image_if_missing(context, vi) [ 622.708564] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] image_cache(vi, tmp_image_ds_loc) [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] vm_util.copy_virtual_disk( [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] session._wait_for_task(vmdk_copy_task) [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] return self.wait_for_task(task_ref) [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] return evt.wait() [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] result = hub.switch() [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 622.708893] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] return self.greenlet.switch() [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self.f(*self.args, **self.kw) [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] raise exceptions.translate_fault(task_info.error) [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] Faults: ['InvalidArgument'] [ 622.709280] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] [ 622.709280] env[60788]: INFO nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Terminating instance [ 622.710480] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 622.710685] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 622.710922] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-959334eb-1012-4e99-bab1-5cbdfbf2a56f {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.713095] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 622.713288] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 622.714013] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7facaa20-4325-40cd-b0d6-fc83ed5eebba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.720967] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 622.721209] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a1f7a157-641e-4eeb-b2c7-e4234d1b2e77 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.723364] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 622.723536] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 622.724484] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-112a32c1-cbab-4940-ba21-e2c6c1ad61cc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.729099] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Waiting for the task: (returnval){ [ 622.729099] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5268ce64-9aa5-1581-2bdc-a3d347194974" [ 622.729099] env[60788]: _type = "Task" [ 622.729099] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 622.736162] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5268ce64-9aa5-1581-2bdc-a3d347194974, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 622.797719] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 622.798029] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 622.798273] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Deleting the datastore file [datastore2] 175d889c-2151-4336-920f-db9a54253946 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 622.798592] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-108c7dc2-df9f-4263-a349-9925389caf8b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.804734] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Waiting for the task: (returnval){ [ 622.804734] env[60788]: value = "task-2205130" [ 622.804734] env[60788]: _type = "Task" [ 622.804734] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 622.812474] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Task: {'id': task-2205130, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 623.239868] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 623.240139] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Creating directory with path [datastore2] vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 623.240379] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-240ceaac-233c-4a85-86ca-4ffc8e77b903 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.252019] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Created directory with path [datastore2] vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 623.252233] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Fetch image to [datastore2] vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 623.252406] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 623.253176] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-103e9bd4-b649-4b8d-8124-d737222a2eb2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.259675] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a25beba8-d06f-4c80-92af-b8e74e9b653a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.268642] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-331c7d99-8f20-4651-bc91-3714c3482f26 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.300789] env[60788]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-985ea8f3-f281-4013-a318-49f31553f716 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.309532] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bd6b43a9-a7ad-4f3d-9400-3b275b360acc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.315797] env[60788]: DEBUG oslo_vmware.api [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Task: {'id': task-2205130, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067206} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 623.316062] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 623.316258] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 623.316429] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 623.316605] env[60788]: INFO nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Took 0.60 seconds to destroy the instance on the hypervisor. 
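The traceback earlier in this span ends inside oslo.vmware's task polling: Nova hands the CopyVirtualDisk_Task to session._wait_for_task, and _poll_task keeps re-reading the task's state until vCenter reports success or error, translating an error into VimFaultException (here "A specified parameter was not correct: fileType"). The following is a minimal, illustrative paraphrase of that loop, not the oslo.vmware source; get_task_info is an assumed helper standing in for the PropertyCollector round-trip that fetches TaskInfo for a task reference such as 'task-2205128'.

```python
# Illustrative paraphrase of the wait_for_task / _poll_task pattern
# visible in the trace above -- not the actual oslo.vmware code.
import time


class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Poll a vCenter task ref to completion.

    get_task_info(task_ref) is assumed to return an object with
    .state, .result, .fault_list and .error_msg attributes, mirroring
    the TaskInfo that vCenter exposes.
    """
    while True:
        info = get_task_info(task_ref)
        if info.state in ('queued', 'running'):
            # Source of the "progress is 0%" debug lines in the log.
            time.sleep(poll_interval)
        elif info.state == 'success':
            return info.result
        else:
            # state == 'error': surface the server-side fault, e.g.
            # "A specified parameter was not correct: fileType".
            raise VimFaultException(info.fault_list, info.error_msg)
```

On the error branch the compute manager aborts the instance's resource claim and reschedules the build, which is exactly the sequence the entries that follow show.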
[ 623.318642] env[60788]: DEBUG nova.compute.claims [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 623.318842] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 623.319106] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 623.331167] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 623.391534] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 623.453647] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 623.453850] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 623.809864] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-062934b3-22c9-4a0c-a386-70a3ab1ccfc5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.817537] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db5daf62-e8f4-4fad-b09d-1c575209ad10 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.847181] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f0eaae0-37a9-48c2-bbf9-1b51854ab143 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.854951] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edb5266d-4f02-45cc-8f41-753ae5dce5ff {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 623.868126] env[60788]: DEBUG nova.compute.provider_tree [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 623.891453] env[60788]: DEBUG nova.scheduler.client.report [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 623.906976] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.588s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 623.907569] env[60788]: ERROR nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 623.907569] env[60788]: Faults: ['InvalidArgument'] [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] Traceback (most recent call last): [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 623.907569] env[60788]: ERROR 
nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self.driver.spawn(context, instance, image_meta, [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self._vmops.spawn(context, instance, image_meta, injected_files, [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self._fetch_image_if_missing(context, vi) [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] image_cache(vi, tmp_image_ds_loc) [ 623.907569] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] vm_util.copy_virtual_disk( [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] session._wait_for_task(vmdk_copy_task) [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] return self.wait_for_task(task_ref) [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] return evt.wait() [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] result = hub.switch() [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] return self.greenlet.switch() [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 623.907858] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] self.f(*self.args, **self.kw) [ 623.908159] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 623.908159] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] raise exceptions.translate_fault(task_info.error) [ 623.908159] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 623.908159] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] Faults: ['InvalidArgument'] [ 623.908159] env[60788]: ERROR nova.compute.manager [instance: 175d889c-2151-4336-920f-db9a54253946] [ 623.908327] env[60788]: DEBUG nova.compute.utils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 623.909823] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Build of instance 175d889c-2151-4336-920f-db9a54253946 was re-scheduled: A specified parameter was not correct: fileType [ 623.909823] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 623.910256] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 623.910436] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 623.910606] env[60788]: DEBUG nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 623.910771] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 624.636421] env[60788]: DEBUG nova.network.neutron [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 624.653183] env[60788]: INFO nova.compute.manager [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] [instance: 175d889c-2151-4336-920f-db9a54253946] Took 0.74 seconds to deallocate network for instance. [ 624.765672] env[60788]: INFO nova.scheduler.client.report [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Deleted allocations for instance 175d889c-2151-4336-920f-db9a54253946 [ 624.786265] env[60788]: DEBUG oslo_concurrency.lockutils [None req-490ffff6-318d-440c-8be7-7e1ded6ba6c9 tempest-ImagesOneServerTestJSON-1627704671 tempest-ImagesOneServerTestJSON-1627704671-project-member] Lock "175d889c-2151-4336-920f-db9a54253946" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 103.161s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 624.824486] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 624.881058] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 624.881311] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 624.883305] env[60788]: INFO nova.compute.claims [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 625.317914] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f626829-b29e-4d27-9123-f19ad8ecc239 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.325466] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-708121c8-a249-4cfa-82d9-3b6e7408b82f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.357075] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ab5c4d1-0fc2-4b68-aa67-6ac2274d22a5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.365452] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-509015e3-58c8-4ad6-b007-655c7e1f1e18 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.380158] env[60788]: DEBUG nova.compute.provider_tree [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 625.389803] env[60788]: DEBUG nova.scheduler.client.report [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 625.403073] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 
tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.522s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 625.404954] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 625.441227] env[60788]: DEBUG nova.compute.utils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 625.442514] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 625.444021] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 625.452781] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 625.519159] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 625.537861] env[60788]: DEBUG nova.policy [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5781618470a94855a4539ff66776be75', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b67562289dd6472d972bc0ec5c184f32', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 625.547577] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 625.547877] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 625.548113] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 625.548348] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 625.548530] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 625.548712] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 625.549027] env[60788]: DEBUG nova.virt.hardware [None 
req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 625.549238] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 625.549461] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 625.549661] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 625.549927] env[60788]: DEBUG nova.virt.hardware [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 625.550881] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfffef66-61dd-4a3e-b660-3e20c9e09b8b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 625.561222] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f29c0e40-ac16-4516-a9d3-09b240df17e7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 626.049402] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Successfully created port: 151db35a-ec2e-4311-91dc-8921ddacc8b0 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 627.333119] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Successfully updated port: 151db35a-ec2e-4311-91dc-8921ddacc8b0 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 627.354588] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "refresh_cache-aa3bf189-1b7a-40eb-a270-711920dd84a6" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.354751] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 
tempest-ServersAdminTestJSON-756204499-project-member] Acquired lock "refresh_cache-aa3bf189-1b7a-40eb-a270-711920dd84a6" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 627.354906] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.426399] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.613315] env[60788]: DEBUG nova.compute.manager [req-b2c32e01-4efc-488a-a3f1-f1b7273a38a9 req-7c062444-a8a3-496f-b1ae-1a743325db87 service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Received event network-vif-plugged-151db35a-ec2e-4311-91dc-8921ddacc8b0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 627.613552] env[60788]: DEBUG oslo_concurrency.lockutils [req-b2c32e01-4efc-488a-a3f1-f1b7273a38a9 req-7c062444-a8a3-496f-b1ae-1a743325db87 service nova] Acquiring lock "aa3bf189-1b7a-40eb-a270-711920dd84a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 627.614388] env[60788]: DEBUG oslo_concurrency.lockutils [req-b2c32e01-4efc-488a-a3f1-f1b7273a38a9 req-7c062444-a8a3-496f-b1ae-1a743325db87 service nova] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 627.615240] env[60788]: DEBUG oslo_concurrency.lockutils [req-b2c32e01-4efc-488a-a3f1-f1b7273a38a9 req-7c062444-a8a3-496f-b1ae-1a743325db87 service nova] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 627.618063] env[60788]: DEBUG nova.compute.manager [req-b2c32e01-4efc-488a-a3f1-f1b7273a38a9 req-7c062444-a8a3-496f-b1ae-1a743325db87 service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] No waiting events found dispatching network-vif-plugged-151db35a-ec2e-4311-91dc-8921ddacc8b0 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 627.618063] env[60788]: WARNING nova.compute.manager [req-b2c32e01-4efc-488a-a3f1-f1b7273a38a9 req-7c062444-a8a3-496f-b1ae-1a743325db87 service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Received unexpected event network-vif-plugged-151db35a-ec2e-4311-91dc-8921ddacc8b0 for instance with vm_state building and task_state spawning. 
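
The lockutils records and the WARNING above show the external-event plumbing: Neutron reports network-vif-plugged-151db35a..., the handler takes the per-instance "-events" lock, tries to pop a registered waiter, finds none, and logs the event as unexpected because the spawn thread had not yet started waiting. A minimal sketch of that register-then-pop pattern, using plain threading and hypothetical names rather than Nova's actual classes:

    import threading

    class InstanceEvents:
        """Sketch of the register/pop pattern suggested by the log above.
        Names and structure are illustrative, not Nova's implementation."""

        def __init__(self):
            self._lock = threading.Lock()   # stands in for the "-events" lock
            self._waiters = {}              # (instance_id, event_name) -> Event

        def prepare(self, instance_id, event_name):
            """A build thread registers interest before triggering the action."""
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_id, event_name)] = ev
            return ev

        def pop(self, instance_id, event_name):
            """The external-event handler pops a waiter, if any."""
            with self._lock:
                return self._waiters.pop((instance_id, event_name), None)

    def handle_external_event(events, instance_id, event_name):
        waiter = events.pop(instance_id, event_name)
        if waiter is None:
            # Matches the WARNING above: the event arrived before anyone waited.
            print(f"Received unexpected event {event_name} for {instance_id}")
        else:
            waiter.set()

    events = InstanceEvents()
    # Event arrives before the spawn path registered a waiter:
    handle_external_event(events, "aa3bf189", "network-vif-plugged-151db35a")
    # Normal flow: register first, then the event unblocks the waiter.
    ev = events.prepare("aa3bf189", "network-vif-plugged-151db35a")
    handle_external_event(events, "aa3bf189", "network-vif-plugged-151db35a")
    assert ev.is_set()

In this trace the warning appears benign: the instance is still building and the event simply arrived before anyone registered for it.
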
[ 627.736419] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Updating instance_info_cache with network_info: [{"id": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "address": "fa:16:3e:ed:69:f5", "network": {"id": "43370e97-dc16-4d29-8811-2aa769c1960f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543926798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b67562289dd6472d972bc0ec5c184f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946", "external-id": "nsx-vlan-transportzone-453", "segmentation_id": 453, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap151db35a-ec", "ovs_interfaceid": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.749783] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Releasing lock "refresh_cache-aa3bf189-1b7a-40eb-a270-711920dd84a6" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 627.750154] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance network_info: |[{"id": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "address": "fa:16:3e:ed:69:f5", "network": {"id": "43370e97-dc16-4d29-8811-2aa769c1960f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543926798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b67562289dd6472d972bc0ec5c184f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946", "external-id": "nsx-vlan-transportzone-453", "segmentation_id": 453, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap151db35a-ec", "ovs_interfaceid": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 627.750553] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None 
req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ed:69:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '25f42474-5594-4733-a681-6c69f4afb946', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '151db35a-ec2e-4311-91dc-8921ddacc8b0', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 627.758697] env[60788]: DEBUG oslo.service.loopingcall [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 627.759418] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 627.759652] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-018036a7-cc47-4774-b7a3-9c261d9c5e78 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 627.780764] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 627.780764] env[60788]: value = "task-2205131" [ 627.780764] env[60788]: _type = "Task" [ 627.780764] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 627.788789] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205131, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 628.291252] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205131, 'name': CreateVM_Task, 'duration_secs': 0.292771} completed successfully. 
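
Between the network_info dump and the CreateVM_Task above, vmops logs the flattened "Instance VIF info" it derived from the Neutron port: bridge name, MAC address, an OpaqueNetwork reference keyed by the NSX logical-switch id, and the vmxnet3 model. A sketch of that mapping, with field names taken from the log records and the function itself only illustrative:

    def vif_info_from_network_info(network_info, vif_model="vmxnet3"):
        """Map neutron-style network_info entries (as dumped above) to the
        flat VIF-info dicts the vmwareapi driver logs. Sketch, not Nova code."""
        vif_infos = []
        for vif in network_info:
            details = vif["details"]
            vif_infos.append({
                "network_name": vif["network"]["bridge"],   # e.g. "br-int"
                "mac_address": vif["address"],
                "network_ref": {
                    "type": "OpaqueNetwork",
                    "network-id": details["nsx-logical-switch-id"],
                    "network-type": "nsx.LogicalSwitch",
                    "use-external-id": True,
                },
                "iface_id": vif["id"],
                "vif_model": vif_model,
            })
        return vif_infos

    network_info = [{
        "id": "151db35a-ec2e-4311-91dc-8921ddacc8b0",
        "address": "fa:16:3e:ed:69:f5",
        "network": {"bridge": "br-int"},
        "details": {"nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946"},
    }]
    print(vif_info_from_network_info(network_info))
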
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 628.291427] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 628.292177] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 628.292350] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 628.292670] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 628.292917] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-337a4746-a574-44c0-bd63-4ddba6029d40 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 628.297816] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 628.297816] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52f7fc78-6949-a6ee-81e5-df94f91b6b80" [ 628.297816] env[60788]: _type = "Task" [ 628.297816] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 628.305982] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52f7fc78-6949-a6ee-81e5-df94f91b6b80, 'name': SearchDatastore_Task} progress is 0%. 
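
The lock names above ("[datastore2] devstack-image-cache_base/1d9d6f6c-...") show the per-image serialization around the datastore image cache: take the image-scoped lock, run SearchDatastore_Task to see whether the cached VMDK already exists, and fetch only on a miss. A minimal single-process sketch of that check-then-fetch pattern; a real deployment would need a cross-process lock, and the datastore search is reduced here to a set lookup:

    import threading
    from collections import defaultdict

    _locks = defaultdict(threading.Lock)   # lock name -> lock
    _cache = set()                         # datastore paths known to be cached

    def ensure_image_cached(datastore, image_id, fetch):
        """Check-then-fetch under a per-image lock, mirroring the lock names
        seen in the log. Illustrative only."""
        name = f"[{datastore}] devstack-image-cache_base/{image_id}"
        with _locks[name]:
            path = f"{name}/{image_id}.vmdk"
            if path in _cache:              # stands in for SearchDatastore_Task
                return path
            fetch(path)                     # download + convert, done at most once
            _cache.add(path)
            return path

    path = ensure_image_cached("datastore2",
                               "1d9d6f6c-1335-48c8-9690-b6c8e781cb21",
                               fetch=lambda p: print("fetching", p))
    print("cached at", path)
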
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 628.810454] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 628.810760] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 628.810996] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 629.759585] env[60788]: DEBUG nova.compute.manager [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Received event network-changed-151db35a-ec2e-4311-91dc-8921ddacc8b0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 629.759777] env[60788]: DEBUG nova.compute.manager [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Refreshing instance network info cache due to event network-changed-151db35a-ec2e-4311-91dc-8921ddacc8b0. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 629.760035] env[60788]: DEBUG oslo_concurrency.lockutils [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] Acquiring lock "refresh_cache-aa3bf189-1b7a-40eb-a270-711920dd84a6" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 629.760197] env[60788]: DEBUG oslo_concurrency.lockutils [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] Acquired lock "refresh_cache-aa3bf189-1b7a-40eb-a270-711920dd84a6" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 629.760381] env[60788]: DEBUG nova.network.neutron [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Refreshing network info cache for port 151db35a-ec2e-4311-91dc-8921ddacc8b0 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 630.318324] env[60788]: DEBUG nova.network.neutron [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Updated VIF entry in instance network info cache for port 151db35a-ec2e-4311-91dc-8921ddacc8b0. 
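
The network-changed-151db35a... event above triggers a refresh rather than a rebuild: the handler re-reads the port from Neutron and swaps the matching entry in the cached network_info list, which is why the log reports "Updated VIF entry ... for port ...". A sketch of that replace-by-port-id step; whether a previously unknown port would be appended is an assumption here, not shown in the trace:

    def update_vif_entry(cached_network_info, refreshed_vif):
        """Replace the cached entry whose "id" matches the changed port,
        leaving other VIFs untouched. Illustrative only."""
        port_id = refreshed_vif["id"]
        for i, vif in enumerate(cached_network_info):
            if vif["id"] == port_id:
                cached_network_info[i] = refreshed_vif
                return "updated"
        # Assumption: a port not yet in the cache would be appended.
        cached_network_info.append(refreshed_vif)
        return "added"

    cache = [{"id": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "active": False}]
    print(update_vif_entry(cache, {"id": "151db35a-ec2e-4311-91dc-8921ddacc8b0",
                                   "active": True}))
    print(cache)
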
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 630.318698] env[60788]: DEBUG nova.network.neutron [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Updating instance_info_cache with network_info: [{"id": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "address": "fa:16:3e:ed:69:f5", "network": {"id": "43370e97-dc16-4d29-8811-2aa769c1960f", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-543926798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b67562289dd6472d972bc0ec5c184f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "25f42474-5594-4733-a681-6c69f4afb946", "external-id": "nsx-vlan-transportzone-453", "segmentation_id": 453, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap151db35a-ec", "ovs_interfaceid": "151db35a-ec2e-4311-91dc-8921ddacc8b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.328887] env[60788]: DEBUG oslo_concurrency.lockutils [req-75132948-34e2-4d8b-a35a-9c7910fa24ee req-ba0e2b70-9fd9-4227-852d-066c0c78871b service nova] Releasing lock "refresh_cache-aa3bf189-1b7a-40eb-a270-711920dd84a6" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 643.252792] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.253098] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 648.753633] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 648.753944] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 648.754021] env[60788]: DEBUG nova.compute.manager [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 649.754318] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 649.754592] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 649.754592] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 649.777580] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.778876] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.778876] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.778876] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.778876] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.778876] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.779103] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.779103] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. 
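
The periodic _heal_instance_info_cache run above rebuilds its candidate list and skips every instance still in the Building state, ending with "Didn't find any instances for network info cache update." A sketch of that selection; the vm_state representation is illustrative:

    def pick_instances_to_heal(instances):
        """Instances still building are skipped; the rest are candidates for
        a network-info refresh. Sketch of the selection logged above."""
        to_heal = []
        for inst in instances:
            if inst["vm_state"] == "building":
                print(f"Skipping network cache update for instance "
                      f"{inst['uuid']} because it is Building.")
                continue
            to_heal.append(inst)
        if not to_heal:
            print("Didn't find any instances for network info cache update.")
        return to_heal

    pick_instances_to_heal([
        {"uuid": "f88189b2-070f-4529-af1b-67c8d9b271a8", "vm_state": "building"},
        {"uuid": "aa3bf189-1b7a-40eb-a270-711920dd84a6", "vm_state": "building"},
    ])
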
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.779103] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.779103] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 649.779103] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 650.753555] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 650.753802] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 651.753387] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 651.753725] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 651.753959] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 651.768899] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 651.769603] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 651.769837] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 651.770065] env[60788]: DEBUG nova.compute.resource_tracker [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 651.771199] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc742f78-0c39-4b2d-a732-c5e3e0a4818f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.780751] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42fbd463-4bad-4a80-93b7-5514e10957fa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.796534] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-213d6162-372b-4a10-9fff-8290f6094256 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.803320] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24a49799-68a8-4823-8b8e-dda8bb498162 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.835386] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181215MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 651.835542] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 651.835738] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 651.914135] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f88189b2-070f-4529-af1b-67c8d9b271a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914135] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914135] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
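
Every lockutils record in this log carries the same bookkeeping: who acquired which named lock, how long they waited, and how long they held it (the compute_resources lock taken above is later released after 0.774s). A sketch of a context manager that produces that waited/held accounting; the real implementation lives in oslo.concurrency, this is only the shape of it:

    import threading
    import time
    from contextlib import contextmanager

    @contextmanager
    def timed_lock(lock, name, caller):
        """Log waited/held durations around a named lock, mirroring the
        lockutils lines in this trace. Illustrative only."""
        t0 = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            held = time.monotonic() - t1
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

    resources = threading.Lock()
    with timed_lock(resources, "compute_resources",
                    "ResourceTracker._update_available_resource"):
        time.sleep(0.01)   # stands in for the audit done while holding the lock
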
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914135] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914364] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb58b00e-1a78-4750-b912-48c94144ea66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914364] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 331fc548-2076-48e2-a84b-94130a99c2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914364] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914364] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914485] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.914485] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 651.925228] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.939032] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.947816] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c7380613-f621-4c56-9e8f-6b4f8dfe3ef1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.958831] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dc9706cc-1c7e-4570-8607-20120306153c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.968581] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 9aec9e50-0470-43f7-98b2-3f2eac50e6bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.978104] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 929a6bc2-109d-4753-8b98-8155c0e4e839 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.987957] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ee5957af-d4c1-4c71-82ba-83c06eb08869 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 651.997826] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 951453a7-a034-4111-9c5c-71d5c25245ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.009783] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c27066c4-1fbb-4918-94c4-62a8bf1c2dda has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.020594] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a92343b0-aad3-4416-9305-1432b35ae1a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.029821] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance cd2ac191-ea52-43cd-a20b-87b963112818 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.043139] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 008c517b-5838-43a0-aad3-5c7436d00275 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.052843] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1e6f4b96-7207-4208-9193-b0d207b1c703 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.062378] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e4bfd8ff-c503-420a-8a89-34c652d9fb2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.071817] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5f6d54d4-6862-46b0-9558-5db0c5b392d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
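
The audit above walks every allocation Placement holds against this node and triages it: instances actively managed here keep their allocations, while instances that were scheduled here but have yet to start are skipped rather than healed. A sketch of that triage; the third (stale-allocation) case is inferred from the method name _remove_deleted_instances_allocations rather than shown in this trace:

    def reconcile_allocations(allocations, instances_on_host):
        """Sort Placement allocations into keep / skip-heal / stale, following
        the narration in the log above. Sketch, not the resource tracker."""
        tracked = {i["uuid"]: i for i in instances_on_host}
        keep, skip_heal, stale = [], [], []
        for uuid in allocations:
            inst = tracked.get(uuid)
            if inst is None:
                stale.append(uuid)      # inferred: candidate for removal
            elif inst["created"]:
                keep.append(uuid)       # "actively managed on this compute host"
            else:
                skip_heal.append(uuid)  # "scheduled ... yet to start"
        return keep, skip_heal, stale

    # Shortened ids for illustration:
    allocs = {"aa3bf189": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},
              "0259d811": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}}
    insts = [{"uuid": "aa3bf189", "created": True},
             {"uuid": "0259d811", "created": False}]
    print(reconcile_allocations(allocs, insts))
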
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.081818] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 977ea808-4e2d-4388-a5af-93048b5754e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.091735] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.101612] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6f437d9f-f904-46ab-9dc6-9902c2ea4c71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.111214] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ef9990e1-e0a7-41c0-b738-de213fd7046a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.121609] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d159fbfa-f391-41f8-97ba-eb145eed26e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.132369] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 064e7e7c-eeca-4822-9d5a-148b9fbdc1f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.142533] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4e86b919-f5f8-458c-a588-bd08bdcccf3b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
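
The per-instance allocations above add up exactly to the "Final resource view" reported just below: ten allocations of DISK_GB=1, MEMORY_MB=128, VCPU=1, plus what appears to be the 512MB of reserved host memory, give used_ram=1792MB, used_disk=10GB, used_vcpus=10. A sketch of that arithmetic:

    def final_resource_view(allocations, reserved_ram_mb=512):
        """Reproduce the used-resource totals from the per-instance
        allocations listed above. Sketch of the bookkeeping only."""
        used = {"MEMORY_MB": reserved_ram_mb, "DISK_GB": 0, "VCPU": 0}
        for alloc in allocations:
            for rc, amount in alloc["resources"].items():
                used[rc] += amount
        return used

    # Ten instances, each holding DISK_GB=1, MEMORY_MB=128, VCPU=1:
    allocs = [{"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}}] * 10
    print(final_resource_view(allocs))
    # -> {'MEMORY_MB': 1792, 'DISK_GB': 10, 'VCPU': 10}, matching
    #    used_ram=1792MB, used_disk=10GB, used_vcpus=10 in the log.
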
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.153407] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e991879c-de94-4e14-9480-95c95bcaaa05 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.168073] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 652.168073] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 652.168073] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 652.528458] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-112626b8-6b3f-480a-9fb4-99d3545ecaa5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.536195] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a70e6456-ad28-40ec-95a0-3805bcb89437 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.565642] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-058c3e64-6aa1-40b8-9eef-61e4d653e637 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.572884] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b78ab81-6ad8-4199-b63d-70b2537ddf75 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.585771] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 652.596948] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 652.610028] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 652.610119] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.774s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.610339] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 671.661168] env[60788]: WARNING oslo_vmware.rw_handles [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 671.661168] env[60788]: ERROR oslo_vmware.rw_handles [ 671.661746] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 671.663301] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 
1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 671.663568] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Copying Virtual Disk [datastore2] vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/bfa14001-2a92-4042-aab2-e31585925865/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 671.663876] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-32012f10-5024-4a0e-92f5-75a8f1d5ec82 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 671.672674] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Waiting for the task: (returnval){ [ 671.672674] env[60788]: value = "task-2205132" [ 671.672674] env[60788]: _type = "Task" [ 671.672674] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 671.684604] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Task: {'id': task-2205132, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 672.183398] env[60788]: DEBUG oslo_vmware.exceptions [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Fault InvalidArgument not matched. 
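
The CopyVirtualDisk_Task above fails, and oslo.vmware's "Fault InvalidArgument not matched" debug line shows the translation step: the fault name is looked up in a registry of specific exception classes and, when nothing matches, falls back to the generic VimFaultException that the spawn error below then carries. (The earlier RemoteDisconnected warning during the image upload is plausibly related, though the trace does not prove it.) A sketch of that lookup-with-fallback, with an illustrative registry:

    class VimFaultException(Exception):
        """Generic carrier used when no specific fault class matches (sketch)."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FileFaultError(Exception):
        pass

    # Illustrative registry; the real mapping in oslo.vmware is larger.
    _FAULT_CLASSES = {"FileFault": FileFaultError}

    def translate_fault(fault_name, message):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:
            # Matches the "Fault InvalidArgument not matched" debug line:
            # fall back to the generic exception, preserving the fault name.
            print(f"Fault {fault_name} not matched.")
            return VimFaultException([fault_name], message)
        return cls(message)

    exc = translate_fault("InvalidArgument",
                          "A specified parameter was not correct: fileType")
    print(type(exc).__name__, exc.fault_list)
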
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 672.183689] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 672.184268] env[60788]: ERROR nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 672.184268] env[60788]: Faults: ['InvalidArgument'] [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Traceback (most recent call last): [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] yield resources [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self.driver.spawn(context, instance, image_meta, [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self._fetch_image_if_missing(context, vi) [ 672.184268] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] image_cache(vi, tmp_image_ds_loc) [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] vm_util.copy_virtual_disk( [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] session._wait_for_task(vmdk_copy_task) [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] return self.wait_for_task(task_ref) [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] return evt.wait() [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] result = hub.switch() [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 672.184573] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] return self.greenlet.switch() [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self.f(*self.args, **self.kw) [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] raise exceptions.translate_fault(task_info.error) [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Faults: ['InvalidArgument'] [ 672.184865] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] [ 672.184865] env[60788]: INFO nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Terminating instance [ 672.186130] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 672.186347] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 672.186531] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-d4bedb16-a2f8-4baf-b3f5-507c3793b408 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.188716] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 672.188904] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 672.189661] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebb0ccf3-2554-4008-8672-843a19c62c09 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.196470] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 672.196680] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-158fff35-2fb6-4741-95b6-89845c1f4bf5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.198852] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 672.199032] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 672.200016] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5c24f32f-c21f-419c-8b53-75af2489fab8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.204451] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 672.204451] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52544308-997c-1351-a0db-c749e371c3c4" [ 672.204451] env[60788]: _type = "Task" [ 672.204451] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 672.211536] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52544308-997c-1351-a0db-c749e371c3c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 672.261797] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 672.262024] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 672.262156] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Deleting the datastore file [datastore2] 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 672.262430] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7b8bb48f-068e-4ced-801e-752da75fe9fa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.268203] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Waiting for the task: (returnval){ [ 672.268203] env[60788]: value = "task-2205134" [ 672.268203] env[60788]: _type = "Task" [ 672.268203] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 672.276592] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Task: {'id': task-2205134, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 672.714053] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 672.714328] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating directory with path [datastore2] vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 672.714565] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-494eeed9-cb0f-44e6-9c59-fee64b096afa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.725884] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Created directory with path [datastore2] vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 672.726270] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Fetch image to [datastore2] vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 672.726270] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 672.726996] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e329eec7-ac69-4941-922f-24e55ebf9fd8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.733979] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42fa1e31-54c8-4682-afd0-a4340f8f891c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.742631] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bbcd6e9-e1fc-400d-9c23-b969e74bcab8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.775847] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4b603778-ff13-4cfa-a944-27cacc3afadb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.783493] env[60788]: DEBUG oslo_vmware.api [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Task: {'id': task-2205134, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071471} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 672.784998] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 672.785213] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 672.785390] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 672.785564] env[60788]: INFO nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Took 0.60 seconds to destroy the instance on the hypervisor. 
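
The spawn failure recorded above bottoms out in oslo_vmware's task poller: nova.virt.vmwareapi.vm_util.copy_virtual_disk() starts a CopyVirtualDisk_Task against vCenter's VirtualDiskManager to populate the image cache, vCenter fails the task with an InvalidArgument fault on the fileType parameter, and _poll_task() re-raises that fault as VimFaultException, which unwinds _build_resources and triggers the unregister/delete cleanup that follows. Below is a minimal sketch of that call path, written in the spirit of the traceback rather than as a copy of Nova's code; the host, credentials, and disk paths are placeholders, not values from this log:

    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    # Illustrative session parameters (positional: host, user, password).
    session = vmware_api.VMwareAPISession(
        'vcenter.example.test', 'user', 'secret',
        api_retry_count=2, task_poll_interval=0.5)

    def copy_sparse_disk(dc_ref, src_path, dest_path):
        vim = session.vim
        # CopyVirtualDisk_Task is the stock vSphere VirtualDiskManager
        # method; invoke_api() resolves it on the vim client and retries
        # transient failures before returning the task reference.
        task = session.invoke_api(
            vim, 'CopyVirtualDisk_Task',
            vim.service_content.virtualDiskManager,
            sourceName=src_path, sourceDatacenter=dc_ref,
            destName=dest_path, destDatacenter=dc_ref)
        try:
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # fault_list names the VIM faults raised by the task, e.g.
            # ['InvalidArgument'] for the fileType error above; Nova lets
            # the exception propagate and destroys the half-built instance.
            print('disk copy failed with faults: %s' % e.fault_list)
            raise
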
[ 672.787517] env[60788]: DEBUG nova.compute.claims [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 672.787681] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 672.787904] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 672.790323] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4bed89b8-5109-475b-b887-37d8453d8385 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 672.810982] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 672.887278] env[60788]: DEBUG oslo_vmware.rw_handles [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 672.950893] env[60788]: DEBUG oslo_vmware.rw_handles [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 672.950893] env[60788]: DEBUG oslo_vmware.rw_handles [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 673.289700] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8e49e43-1ecf-49a5-810e-9a596aa07d2d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.297666] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04bcd920-090b-43cb-9492-6656a1fe65d9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.327526] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0776c61-45e0-48fe-804a-ea1135d5a3cc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.335198] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c1bcf00-1e28-4b32-b747-767c2965b1de {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 673.349356] env[60788]: DEBUG nova.compute.provider_tree [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 673.358405] env[60788]: DEBUG nova.scheduler.client.report [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 673.374966] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.587s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.375494] env[60788]: ERROR nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 673.375494] env[60788]: Faults: ['InvalidArgument'] [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Traceback (most recent call last): [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in 
_build_and_run_instance [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self.driver.spawn(context, instance, image_meta, [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self._fetch_image_if_missing(context, vi) [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] image_cache(vi, tmp_image_ds_loc) [ 673.375494] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] vm_util.copy_virtual_disk( [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] session._wait_for_task(vmdk_copy_task) [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] return self.wait_for_task(task_ref) [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] return evt.wait() [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] result = hub.switch() [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] return self.greenlet.switch() [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 673.375919] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] self.f(*self.args, **self.kw) [ 673.376391] env[60788]: ERROR nova.compute.manager [instance: 
1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 673.376391] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] raise exceptions.translate_fault(task_info.error) [ 673.376391] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 673.376391] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Faults: ['InvalidArgument'] [ 673.376391] env[60788]: ERROR nova.compute.manager [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] [ 673.376391] env[60788]: DEBUG nova.compute.utils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 673.377597] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Build of instance 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2 was re-scheduled: A specified parameter was not correct: fileType [ 673.377597] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 673.377962] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 673.378149] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 673.378305] env[60788]: DEBUG nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 673.378468] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 673.825256] env[60788]: DEBUG nova.network.neutron [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.838954] env[60788]: INFO nova.compute.manager [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] [instance: 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2] Took 0.46 seconds to deallocate network for instance. [ 673.939680] env[60788]: INFO nova.scheduler.client.report [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Deleted allocations for instance 1091a5ac-788a-4a8b-8f29-ad766fe5ffa2 [ 673.977210] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6481de3e-2e2e-4114-aa26-74c4009cce37 tempest-ServerDiagnosticsNegativeTest-2016576589 tempest-ServerDiagnosticsNegativeTest-2016576589-project-member] Lock "1091a5ac-788a-4a8b-8f29-ad766fe5ffa2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 153.635s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 673.991806] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 674.047039] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 674.047325] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 674.049793] env[60788]: INFO nova.compute.claims [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 674.471051] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd0eae60-4fc9-46ba-b73d-c9e35b0d144c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 674.478792] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a486995-3bc6-4071-8026-7179db27c9b6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 674.508272] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9506a744-e97f-4975-b044-cec6204568a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 674.515423] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae2cddcc-2bd5-4a50-b302-b983948f07e4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 674.528585] env[60788]: DEBUG nova.compute.provider_tree [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 674.539048] env[60788]: DEBUG nova.scheduler.client.report [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
674.552851] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.505s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 674.553342] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 674.590422] env[60788]: DEBUG nova.compute.utils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 674.591874] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 674.592193] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 674.600377] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 674.674090] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 674.705732] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 674.705998] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 674.706174] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 674.706393] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 674.706500] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 674.706646] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 674.706853] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 674.707090] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 674.707291] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 674.707539] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 674.707718] env[60788]: DEBUG nova.virt.hardware [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 674.708596] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3fc4cd9-2f2a-450b-a83f-eba8862eafd3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 674.712513] env[60788]: DEBUG nova.policy [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b520d5b9d7e448899e9d98f616e8d5a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c3941c55e704c1888cf802dc49eda52', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 674.720494] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b87c035-2f0e-4676-91fd-60705da59fc7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 675.353283] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Successfully created port: 8421d4cc-d418-4cd2-bf77-41b5e6ff695f {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 676.367172] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Successfully updated port: 8421d4cc-d418-4cd2-bf77-41b5e6ff695f {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 676.388202] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "refresh_cache-0259d811-2677-4164-94cd-5c4f5d935f50" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 676.388369] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquired lock "refresh_cache-0259d811-2677-4164-94cd-5c4f5d935f50" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 676.388524] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 676.459091] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 676.473074] env[60788]: DEBUG nova.compute.manager [req-39d1ebd8-df65-4eb5-9b45-62c75ae1f136 req-753aadeb-1473-47e4-83b4-00fd9f87ee17 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Received event network-vif-plugged-8421d4cc-d418-4cd2-bf77-41b5e6ff695f {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 676.473074] env[60788]: DEBUG oslo_concurrency.lockutils [req-39d1ebd8-df65-4eb5-9b45-62c75ae1f136 req-753aadeb-1473-47e4-83b4-00fd9f87ee17 service nova] Acquiring lock "0259d811-2677-4164-94cd-5c4f5d935f50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 676.473074] env[60788]: DEBUG oslo_concurrency.lockutils [req-39d1ebd8-df65-4eb5-9b45-62c75ae1f136 req-753aadeb-1473-47e4-83b4-00fd9f87ee17 service nova] Lock "0259d811-2677-4164-94cd-5c4f5d935f50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 676.473074] env[60788]: DEBUG oslo_concurrency.lockutils [req-39d1ebd8-df65-4eb5-9b45-62c75ae1f136 req-753aadeb-1473-47e4-83b4-00fd9f87ee17 service nova] Lock "0259d811-2677-4164-94cd-5c4f5d935f50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 676.473279] env[60788]: DEBUG nova.compute.manager [req-39d1ebd8-df65-4eb5-9b45-62c75ae1f136 req-753aadeb-1473-47e4-83b4-00fd9f87ee17 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] No waiting events found dispatching network-vif-plugged-8421d4cc-d418-4cd2-bf77-41b5e6ff695f {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 676.473279] env[60788]: WARNING nova.compute.manager [req-39d1ebd8-df65-4eb5-9b45-62c75ae1f136 req-753aadeb-1473-47e4-83b4-00fd9f87ee17 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Received unexpected event network-vif-plugged-8421d4cc-d418-4cd2-bf77-41b5e6ff695f for instance with vm_state building and task_state spawning. 
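
The warning above is the benign side of Nova's external-event handshake: Neutron created and bound port 8421d4cc-d418-4cd2-bf77-41b5e6ff695f, then delivered network-vif-plugged for it while the instance was still in vm_state building / task_state spawning, and the compute manager found no registered waiter to pop ("No waiting events found"). The mechanism is a keyed registry of waiters that the event callback pops and signals; the sketch below is a deliberately simplified, self-contained version of that pattern using plain threading, not Nova's actual InstanceEvents class:

    import threading

    class InstanceEventsSketch:
        """Registry of per-(instance, event) waiters."""

        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}  # (instance_uuid, event_name) -> Event

        def prepare(self, instance_uuid, event_name):
            # The spawn path registers interest *before* plugging VIFs.
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = ev
            return ev

        def pop(self, instance_uuid, event_name):
            # The external-event handler pops the waiter, if any.
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    events = InstanceEventsSketch()

    def on_external_event(instance_uuid, event_name):
        waiter = events.pop(instance_uuid, event_name)
        if waiter is None:
            # Nobody was waiting: the situation the WARNING above records.
            print('unexpected event %s for %s' % (event_name, instance_uuid))
        else:
            waiter.set()  # unblocks the spawn path

In real Nova the virt layer registers waiters through the compute virt API (a wait_for_instance_event() context manager) before it plugs VIFs; an event arriving with no waiter registered is logged as unexpected but is harmless, which is why the entry above is only a WARNING and the build continues.
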
[ 677.001191] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Updating instance_info_cache with network_info: [{"id": "8421d4cc-d418-4cd2-bf77-41b5e6ff695f", "address": "fa:16:3e:9c:3b:ef", "network": {"id": "4d1875c6-31c0-4feb-96cb-5dee97eef150", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1647175262-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1c3941c55e704c1888cf802dc49eda52", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8421d4cc-d4", "ovs_interfaceid": "8421d4cc-d418-4cd2-bf77-41b5e6ff695f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 677.013097] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Releasing lock "refresh_cache-0259d811-2677-4164-94cd-5c4f5d935f50" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 677.013410] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance network_info: |[{"id": "8421d4cc-d418-4cd2-bf77-41b5e6ff695f", "address": "fa:16:3e:9c:3b:ef", "network": {"id": "4d1875c6-31c0-4feb-96cb-5dee97eef150", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1647175262-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1c3941c55e704c1888cf802dc49eda52", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8421d4cc-d4", "ovs_interfaceid": "8421d4cc-d418-4cd2-bf77-41b5e6ff695f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 677.013828] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9c:3b:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ee9ce73d-4ee8-4b28-b7d3-3a5735039627', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8421d4cc-d418-4cd2-bf77-41b5e6ff695f', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 677.022489] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Creating folder: Project (1c3941c55e704c1888cf802dc49eda52). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 677.023056] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5931c41-1f06-4f21-ad29-2c41ec124857 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 677.035419] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Created folder: Project (1c3941c55e704c1888cf802dc49eda52) in parent group-v449747. [ 677.035627] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Creating folder: Instances. Parent ref: group-v449782. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 677.035855] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b46f60f6-c384-4a6a-9b18-59440d449e3e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 677.044485] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Created folder: Instances in parent group-v449782. [ 677.044727] env[60788]: DEBUG oslo.service.loopingcall [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 677.044911] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 677.045166] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aedd1f75-79cb-4e48-958f-be802200dbd8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 677.063872] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 677.063872] env[60788]: value = "task-2205137" [ 677.063872] env[60788]: _type = "Task" [ 677.063872] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 677.071524] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205137, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 677.574988] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205137, 'name': CreateVM_Task, 'duration_secs': 0.278497} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 677.575228] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 677.575760] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 677.575986] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 677.576302] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 677.576550] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1d230ac5-eb6e-4c50-ad03-af64c5c02bb5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 677.582056] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Waiting for the task: (returnval){ [ 677.582056] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]529597dc-0a29-100d-5ec4-cc7be181b8c8" [ 677.582056] env[60788]: _type = "Task" [ 677.582056] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 677.590707] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]529597dc-0a29-100d-5ec4-cc7be181b8c8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 678.092664] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 678.092927] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 678.093152] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 678.564734] env[60788]: DEBUG nova.compute.manager [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Received event network-changed-8421d4cc-d418-4cd2-bf77-41b5e6ff695f {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 678.564828] env[60788]: DEBUG nova.compute.manager [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Refreshing instance network info cache due to event network-changed-8421d4cc-d418-4cd2-bf77-41b5e6ff695f. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 678.565060] env[60788]: DEBUG oslo_concurrency.lockutils [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] Acquiring lock "refresh_cache-0259d811-2677-4164-94cd-5c4f5d935f50" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 678.565202] env[60788]: DEBUG oslo_concurrency.lockutils [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] Acquired lock "refresh_cache-0259d811-2677-4164-94cd-5c4f5d935f50" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 678.565828] env[60788]: DEBUG nova.network.neutron [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Refreshing network info cache for port 8421d4cc-d418-4cd2-bf77-41b5e6ff695f {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 679.054974] env[60788]: DEBUG nova.network.neutron [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Updated VIF entry in instance network info cache for port 8421d4cc-d418-4cd2-bf77-41b5e6ff695f. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 679.055371] env[60788]: DEBUG nova.network.neutron [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Updating instance_info_cache with network_info: [{"id": "8421d4cc-d418-4cd2-bf77-41b5e6ff695f", "address": "fa:16:3e:9c:3b:ef", "network": {"id": "4d1875c6-31c0-4feb-96cb-5dee97eef150", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1647175262-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1c3941c55e704c1888cf802dc49eda52", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee9ce73d-4ee8-4b28-b7d3-3a5735039627", "external-id": "cl2-zone-465", "segmentation_id": 465, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8421d4cc-d4", "ovs_interfaceid": "8421d4cc-d418-4cd2-bf77-41b5e6ff695f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 679.064238] env[60788]: DEBUG oslo_concurrency.lockutils [req-2327be45-9004-4083-9abd-ad86d212c563 req-25c10703-1493-4734-89a3-ba3e22f05dc0 service nova] Releasing lock "refresh_cache-0259d811-2677-4164-94cd-5c4f5d935f50" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 683.517376] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 683.517645] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 708.750579] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.753549] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 709.753892] env[60788]: DEBUG nova.compute.manager [None
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 709.753892] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 709.773485] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.773676] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.773855] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774031] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774206] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774367] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774523] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774667] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774779] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.774898] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 709.775032] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 709.775540] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 710.753619] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 710.753872] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 711.749204] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 711.753875] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 711.768274] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.768515] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.768774] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 711.768977] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 711.770217] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98213135-1695-4515-8b47-e525b1d35d37 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.779264] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-016bdb0b-d22b-47cc-8bae-4d19f4f4f282 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.794358] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c434855-b2e1-4a08-8792-536eb3d33cf4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.800685] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77cf5500-9a42-49b9-b745-ed58d5a9e7c1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.830308] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181224MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 711.830466] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.830665] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.906830] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f88189b2-070f-4529-af1b-67c8d9b271a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.906830] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.906830] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.906830] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb58b00e-1a78-4750-b912-48c94144ea66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.907049] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 331fc548-2076-48e2-a84b-94130a99c2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.907049] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.907049] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.907049] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.907155] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.907155] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 711.918265] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.928458] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c7380613-f621-4c56-9e8f-6b4f8dfe3ef1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.938570] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dc9706cc-1c7e-4570-8607-20120306153c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.948880] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 9aec9e50-0470-43f7-98b2-3f2eac50e6bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.958408] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 929a6bc2-109d-4753-8b98-8155c0e4e839 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.967353] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ee5957af-d4c1-4c71-82ba-83c06eb08869 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.975992] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 951453a7-a034-4111-9c5c-71d5c25245ff has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.984571] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c27066c4-1fbb-4918-94c4-62a8bf1c2dda has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 711.992958] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a92343b0-aad3-4416-9305-1432b35ae1a8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.001632] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance cd2ac191-ea52-43cd-a20b-87b963112818 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.010445] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 008c517b-5838-43a0-aad3-5c7436d00275 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.019109] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1e6f4b96-7207-4208-9193-b0d207b1c703 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.027478] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e4bfd8ff-c503-420a-8a89-34c652d9fb2c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.035675] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5f6d54d4-6862-46b0-9558-5db0c5b392d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.044229] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 977ea808-4e2d-4388-a5af-93048b5754e3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.053505] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.061860] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6f437d9f-f904-46ab-9dc6-9902c2ea4c71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.070324] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ef9990e1-e0a7-41c0-b738-de213fd7046a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.079394] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d159fbfa-f391-41f8-97ba-eb145eed26e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.088100] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 064e7e7c-eeca-4822-9d5a-148b9fbdc1f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.096714] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4e86b919-f5f8-458c-a588-bd08bdcccf3b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.105385] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e991879c-de94-4e14-9480-95c95bcaaa05 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.113957] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.122654] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 712.122896] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 712.123056] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 712.505215] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90c8182b-cd0e-48ee-94c9-31696905404b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.513335] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a06a443-3bf4-4f11-b82c-2f69c0d77834 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.544204] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1175ecf-00b9-47a8-a7bf-224bc831b1ca {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.552462] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c82bb4-54b6-46c7-80f7-b8269869530b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 712.565841] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.574499] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.588302] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 712.588502] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.758s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 713.589221] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 713.589511] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 713.753931] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 714.753974] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 718.748576] env[60788]: DEBUG oslo_concurrency.lockutils [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "f88189b2-070f-4529-af1b-67c8d9b271a8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 720.613489] env[60788]: WARNING oslo_vmware.rw_handles [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 720.613489] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 720.613489] env[60788]: 
ERROR oslo_vmware.rw_handles [ 720.614081] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 720.615260] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 720.615506] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Copying Virtual Disk [datastore2] vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/585dfb5f-0fe7-4c0a-84d1-d23585b2eb35/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 720.615783] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2108efb8-641a-4a50-b16a-d73df9f91d21 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 720.624745] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 720.624745] env[60788]: value = "task-2205138" [ 720.624745] env[60788]: _type = "Task" [ 720.624745] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 720.632793] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': task-2205138, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 721.135416] env[60788]: DEBUG oslo_vmware.exceptions [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 721.135686] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 721.136252] env[60788]: ERROR nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 721.136252] env[60788]: Faults: ['InvalidArgument'] [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Traceback (most recent call last): [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] yield resources [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self.driver.spawn(context, instance, image_meta, [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self._fetch_image_if_missing(context, vi) [ 721.136252] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] image_cache(vi, tmp_image_ds_loc) [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] vm_util.copy_virtual_disk( [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] session._wait_for_task(vmdk_copy_task) [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] return self.wait_for_task(task_ref) [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] return evt.wait() [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] result = hub.switch() [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 721.136926] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] return self.greenlet.switch() [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self.f(*self.args, **self.kw) [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] raise exceptions.translate_fault(task_info.error) [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Faults: ['InvalidArgument'] [ 721.137559] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] [ 721.137559] env[60788]: INFO nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Terminating instance [ 721.138120] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 721.138334] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 721.138568] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d4e53af1-0c0a-4fb3-b929-558ee3a5fc21 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.140775] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 721.140960] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 721.141720] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f979144f-5f21-4556-ac2f-40a1c4fb319a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.149519] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 721.149774] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-31bc0f42-a26e-452f-8de5-2bf9bdc8aed3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.152338] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 721.152519] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 721.153563] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-840f18c5-9eae-4b9c-b2c1-9d0ca2d12f40 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.158450] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Waiting for the task: (returnval){ [ 721.158450] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]520e3a2d-5106-7129-b2b7-c75bdc7c2ea6" [ 721.158450] env[60788]: _type = "Task" [ 721.158450] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 721.170275] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]520e3a2d-5106-7129-b2b7-c75bdc7c2ea6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 721.228080] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 721.228357] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 721.228547] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Deleting the datastore file [datastore2] f88189b2-070f-4529-af1b-67c8d9b271a8 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 721.228811] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d401e6b0-0d29-4215-a9b7-3b7b7c3840e8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.235253] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 721.235253] env[60788]: value = "task-2205140" [ 721.235253] env[60788]: _type = "Task" [ 721.235253] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 721.243119] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': task-2205140, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 721.668390] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 721.668761] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Creating directory with path [datastore2] vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 721.669824] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-414461f2-4956-49c8-bc41-4c5b987fa653 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.685484] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Created directory with path [datastore2] vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 721.685718] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Fetch image to [datastore2] vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 721.685903] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 721.686724] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cedd26d9-5d69-4bec-ad6d-6541b19f6ee1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.693575] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04754577-1034-4157-b9de-8f70689d8f76 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.703040] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba0feb53-e9de-4fe4-9a4c-c2e5314ddaa6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.734521] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e5aaa62c-2e9d-483c-bc78-c69733a71c5c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.745286] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-066eb916-8ef2-4847-9bdf-3ed857ace391 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 721.746997] env[60788]: DEBUG oslo_vmware.api [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': task-2205140, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065948} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 721.747286] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 721.747465] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 721.747589] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 721.747763] env[60788]: INFO nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 721.749799] env[60788]: DEBUG nova.compute.claims [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 721.749972] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 721.750218] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 721.773847] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 721.838438] env[60788]: DEBUG oslo_vmware.rw_handles [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 721.906219] env[60788]: DEBUG oslo_vmware.rw_handles [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 721.906451] env[60788]: DEBUG oslo_vmware.rw_handles [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 722.263865] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eab6d74-0826-4910-ba82-d9c66bab1634 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.269257] env[60788]: DEBUG oslo_concurrency.lockutils [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 722.272750] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58a35b87-02a8-409f-95e6-3fdf16a752b3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.303070] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61ab19e3-83e6-4399-bebb-060adb92df94 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.309943] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcbd1b28-f7b5-41ff-b021-40d7dbf62593 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 722.324188] env[60788]: DEBUG nova.compute.provider_tree [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 722.332777] env[60788]: DEBUG nova.scheduler.client.report [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 722.346010] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.596s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 722.346542] env[60788]: ERROR nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Failed to build and run instance: 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 722.346542] env[60788]: Faults: ['InvalidArgument'] [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Traceback (most recent call last): [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self.driver.spawn(context, instance, image_meta, [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self._fetch_image_if_missing(context, vi) [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] image_cache(vi, tmp_image_ds_loc) [ 722.346542] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] vm_util.copy_virtual_disk( [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] session._wait_for_task(vmdk_copy_task) [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] return self.wait_for_task(task_ref) [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] return evt.wait() [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] result = hub.switch() [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] return 
self.greenlet.switch() [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 722.346909] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] self.f(*self.args, **self.kw) [ 722.347289] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 722.347289] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] raise exceptions.translate_fault(task_info.error) [ 722.347289] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 722.347289] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Faults: ['InvalidArgument'] [ 722.347289] env[60788]: ERROR nova.compute.manager [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] [ 722.347289] env[60788]: DEBUG nova.compute.utils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 722.348698] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Build of instance f88189b2-070f-4529-af1b-67c8d9b271a8 was re-scheduled: A specified parameter was not correct: fileType [ 722.348698] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 722.349089] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 722.349265] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 722.349423] env[60788]: DEBUG nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 722.349581] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.888783] env[60788]: DEBUG nova.network.neutron [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.899349] env[60788]: INFO nova.compute.manager [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Took 0.55 seconds to deallocate network for instance. [ 722.997874] env[60788]: INFO nova.scheduler.client.report [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Deleted allocations for instance f88189b2-070f-4529-af1b-67c8d9b271a8 [ 723.019337] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1c4f9c89-c040-4abb-9d33-27c81edb94d5 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.297s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.021167] env[60788]: DEBUG oslo_concurrency.lockutils [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 4.272s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.021167] env[60788]: DEBUG oslo_concurrency.lockutils [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "f88189b2-070f-4529-af1b-67c8d9b271a8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 723.021167] env[60788]: DEBUG oslo_concurrency.lockutils [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.021432] env[60788]: DEBUG oslo_concurrency.lockutils [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.023035] env[60788]: INFO nova.compute.manager [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Terminating instance [ 723.024756] env[60788]: DEBUG nova.compute.manager [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 723.024933] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 723.025413] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9d41016b-d7f3-4079-be64-325cc60c5051 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.035084] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3cf276e-a9d1-4d6d-ad92-fdf4dfae6944 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.046380] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 723.066502] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f88189b2-070f-4529-af1b-67c8d9b271a8 could not be found. 
[ 723.066792] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 723.066977] env[60788]: INFO nova.compute.manager [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 723.067238] env[60788]: DEBUG oslo.service.loopingcall [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 723.067446] env[60788]: DEBUG nova.compute.manager [-] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 723.067542] env[60788]: DEBUG nova.network.neutron [-] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.102388] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 723.102586] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 723.104321] env[60788]: INFO nova.compute.claims [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 723.106800] env[60788]: DEBUG nova.network.neutron [-] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.114433] env[60788]: INFO nova.compute.manager [-] [instance: f88189b2-070f-4529-af1b-67c8d9b271a8] Took 0.05 seconds to deallocate network for instance. 
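The "Acquiring lock ... / Lock ... acquired ... waited / ... released ... held" triplets that recur throughout this log (lockutils.py:404/409/423) are emitted by oslo.concurrency's synchronized wrapper, while the refresh_cache entries (lockutils.py:312/315/333) use the bare lock() context manager. A minimal illustration of both spellings; the lock name is taken from the log, the bodies are placeholders:

    from oslo_concurrency import lockutils

    # Decorator form: the inner wrapper logs acquire/acquired/released
    # around each call, including the waited/held durations.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        pass  # runs with the named lock held

    # Context-manager form, as used for the refresh_cache-* locks above.
    with lockutils.lock('compute_resources'):
        pass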
[ 723.238198] env[60788]: DEBUG oslo_concurrency.lockutils [None req-07dc4f0b-e924-4829-abea-40e7965bae61 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "f88189b2-070f-4529-af1b-67c8d9b271a8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.218s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.478202] env[60788]: DEBUG oslo_concurrency.lockutils [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "fb58b00e-1a78-4750-b912-48c94144ea66" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 723.590980] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fea7570-1a64-4597-9ada-1467b0a87a3e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.598730] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0c12521-0e15-4a91-a743-405404a38da5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.629681] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d363b6c-8f54-4646-9abb-4d7622ba5bf3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.637182] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfbc04b9-c69b-4e13-aeb3-d0beaa6c3883 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.650683] env[60788]: DEBUG nova.compute.provider_tree [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.663389] env[60788]: DEBUG nova.scheduler.client.report [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.679559] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.577s {{(pid=60788) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 723.680053] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 723.721520] env[60788]: DEBUG nova.compute.utils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 723.723447] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 723.723685] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 723.732278] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 723.800131] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 723.814623] env[60788]: DEBUG nova.policy [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89b673319ed34de9859c0f58f1c616c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d4606e74dad40acba2d78ea01a69919', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 723.836264] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 723.836522] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 723.836677] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 723.836853] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 723.836995] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 723.837158] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 723.837365] 
env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 723.837521] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 723.837682] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 723.837842] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 723.838015] env[60788]: DEBUG nova.virt.hardware [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 723.839235] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ecb9a16-a7cc-4cb4-a7fa-6e4a7e5522a9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 723.847683] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21350dfa-5cd7-4daa-871d-c749eb86722f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 724.093313] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 724.351078] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 724.351316] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 724.622235] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Successfully created port: f432305c-1ba2-42e4-b3bb-732a09f3fb9d {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 725.631579] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Successfully updated port: f432305c-1ba2-42e4-b3bb-732a09f3fb9d {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 725.642637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "refresh_cache-d0480645-be38-48de-9ae5-05c4eb0bf5d3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 725.642805] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "refresh_cache-d0480645-be38-48de-9ae5-05c4eb0bf5d3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 725.642953] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 725.703541] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 725.891448] env[60788]: DEBUG nova.compute.manager [req-fb86a930-a459-42a9-8150-95ce30973ee7 req-c4fb9603-87ef-48c7-8c5e-6373737cbda4 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Received event network-vif-plugged-f432305c-1ba2-42e4-b3bb-732a09f3fb9d {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 725.891680] env[60788]: DEBUG oslo_concurrency.lockutils [req-fb86a930-a459-42a9-8150-95ce30973ee7 req-c4fb9603-87ef-48c7-8c5e-6373737cbda4 service nova] Acquiring lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 725.892279] env[60788]: DEBUG oslo_concurrency.lockutils [req-fb86a930-a459-42a9-8150-95ce30973ee7 req-c4fb9603-87ef-48c7-8c5e-6373737cbda4 service nova] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 725.892279] env[60788]: DEBUG oslo_concurrency.lockutils [req-fb86a930-a459-42a9-8150-95ce30973ee7 req-c4fb9603-87ef-48c7-8c5e-6373737cbda4 service nova] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 725.892279] env[60788]: DEBUG nova.compute.manager [req-fb86a930-a459-42a9-8150-95ce30973ee7 req-c4fb9603-87ef-48c7-8c5e-6373737cbda4 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] No waiting events found dispatching network-vif-plugged-f432305c-1ba2-42e4-b3bb-732a09f3fb9d {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 725.892487] env[60788]: WARNING nova.compute.manager [req-fb86a930-a459-42a9-8150-95ce30973ee7 req-c4fb9603-87ef-48c7-8c5e-6373737cbda4 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Received unexpected event network-vif-plugged-f432305c-1ba2-42e4-b3bb-732a09f3fb9d for instance with vm_state building and task_state spawning. 
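The pair of entries just above ("Received event network-vif-plugged-f432305c-..." followed by "No waiting events found dispatching ... Received unexpected event") means Neutron delivered the vif-plugged notification before any greenthread had registered to wait for it, so Nova logs a warning and drops it. A generic, hedged sketch of the underlying wait-for-event idea using eventlet primitives (an illustration of the pattern only, not Nova's event machinery; the port id is the one from the log):

    import eventlet

    plugged = eventlet.event.Event()

    def external_instance_event(name, tag):
        # invoked when a 'network-vif-plugged-<port-id>' notification arrives
        if name == 'network-vif-plugged' and not plugged.ready():
            plugged.send(tag)

    # Simulate the notification arriving shortly after the wait begins.
    eventlet.spawn_after(
        0.1, external_instance_event,
        'network-vif-plugged', 'f432305c-1ba2-42e4-b3bb-732a09f3fb9d')

    with eventlet.Timeout(300):
        port_id = plugged.wait()  # returns the tag passed to send()
    print('vif plugged:', port_id)

If the event fires before anything is waiting, as happened in the log, there is no Event to send to yet; Nova records it as unexpected rather than blocking the spawn.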
[ 726.000361] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Updating instance_info_cache with network_info: [{"id": "f432305c-1ba2-42e4-b3bb-732a09f3fb9d", "address": "fa:16:3e:64:18:21", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf432305c-1b", "ovs_interfaceid": "f432305c-1ba2-42e4-b3bb-732a09f3fb9d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.013564] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "refresh_cache-d0480645-be38-48de-9ae5-05c4eb0bf5d3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 726.013850] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance network_info: |[{"id": "f432305c-1ba2-42e4-b3bb-732a09f3fb9d", "address": "fa:16:3e:64:18:21", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf432305c-1b", "ovs_interfaceid": "f432305c-1ba2-42e4-b3bb-732a09f3fb9d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 726.014234] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:64:18:21', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f432305c-1ba2-42e4-b3bb-732a09f3fb9d', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 726.022057] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating folder: Project (8d4606e74dad40acba2d78ea01a69919). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 726.022731] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-448dc233-0cc7-4975-aff5-305528af5825 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 726.033169] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created folder: Project (8d4606e74dad40acba2d78ea01a69919) in parent group-v449747. [ 726.033799] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating folder: Instances. Parent ref: group-v449785. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 726.033799] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2fe53ee3-d99c-420f-9df5-610cea61ba40 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 726.042561] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created folder: Instances in parent group-v449785. [ 726.042799] env[60788]: DEBUG oslo.service.loopingcall [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 726.042981] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 726.043219] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fbcbf491-00f4-4b5b-91b2-566e287b2ec1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 726.062058] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 726.062058] env[60788]: value = "task-2205143" [ 726.062058] env[60788]: _type = "Task" [ 726.062058] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 726.070586] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205143, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 726.573346] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205143, 'name': CreateVM_Task, 'duration_secs': 0.308737} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 726.573672] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 726.574404] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 726.574579] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 726.574903] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 726.575180] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-11f3a2d1-5de1-4da3-8ef9-fc428f7ac5db {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 726.580031] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 726.580031] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52a1b70d-78a9-c143-d18d-dd855dd9b4f1" [ 726.580031] env[60788]: _type = "Task" [ 726.580031] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 726.587916] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52a1b70d-78a9-c143-d18d-dd855dd9b4f1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 727.090589] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 727.090893] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 727.091153] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.010365] env[60788]: DEBUG nova.compute.manager [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Received event network-changed-f432305c-1ba2-42e4-b3bb-732a09f3fb9d {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 728.010602] env[60788]: DEBUG nova.compute.manager [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Refreshing instance network info cache due to event network-changed-f432305c-1ba2-42e4-b3bb-732a09f3fb9d. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 728.010821] env[60788]: DEBUG oslo_concurrency.lockutils [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] Acquiring lock "refresh_cache-d0480645-be38-48de-9ae5-05c4eb0bf5d3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 728.010890] env[60788]: DEBUG oslo_concurrency.lockutils [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] Acquired lock "refresh_cache-d0480645-be38-48de-9ae5-05c4eb0bf5d3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 728.011098] env[60788]: DEBUG nova.network.neutron [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Refreshing network info cache for port f432305c-1ba2-42e4-b3bb-732a09f3fb9d {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 728.483582] env[60788]: DEBUG nova.network.neutron [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Updated VIF entry in instance network info cache for port f432305c-1ba2-42e4-b3bb-732a09f3fb9d. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 728.483951] env[60788]: DEBUG nova.network.neutron [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Updating instance_info_cache with network_info: [{"id": "f432305c-1ba2-42e4-b3bb-732a09f3fb9d", "address": "fa:16:3e:64:18:21", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf432305c-1b", "ovs_interfaceid": "f432305c-1ba2-42e4-b3bb-732a09f3fb9d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.493415] env[60788]: DEBUG oslo_concurrency.lockutils [req-61d6465f-fad2-44e2-8c7f-af8621f143e8 req-ed784ca8-290b-41ee-9f76-f277a3c8bba8 service nova] Releasing lock "refresh_cache-d0480645-be38-48de-9ae5-05c4eb0bf5d3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 732.255679] env[60788]: DEBUG oslo_concurrency.lockutils [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "331fc548-2076-48e2-a84b-94130a99c2ca" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 735.646119] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "80e7296f-45ed-4987-9884-05bd883f4144" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 736.690176] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 739.314316] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 740.816709] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 741.180665] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "0259d811-2677-4164-94cd-5c4f5d935f50" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 750.894052] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8ac96037-c57f-46b4-9dea-9df48c62b07f tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Acquiring lock "6a101160-4c4a-42a5-9dfa-e7f41aa9788a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 750.894361] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8ac96037-c57f-46b4-9dea-9df48c62b07f tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Lock "6a101160-4c4a-42a5-9dfa-e7f41aa9788a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 756.338136] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4b9a2a04-2478-44aa-9c13-20e050488769 tempest-ServerRescueTestJSONUnderV235-1049840473 tempest-ServerRescueTestJSONUnderV235-1049840473-project-member] Acquiring lock "57c974dd-093c-44c1-ab08-1659e25bb392" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 756.338481] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4b9a2a04-2478-44aa-9c13-20e050488769 tempest-ServerRescueTestJSONUnderV235-1049840473 tempest-ServerRescueTestJSONUnderV235-1049840473-project-member] Lock "57c974dd-093c-44c1-ab08-1659e25bb392" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 760.697772] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f8ea6717-4f26-410a-a4f3-f6020418bb0b tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] Acquiring lock "d5972768-f55f-495b-a49f-43b00c4647c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.698141] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-f8ea6717-4f26-410a-a4f3-f6020418bb0b tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] Lock "d5972768-f55f-495b-a49f-43b00c4647c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 763.963782] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f241914f-1c26-4436-8658-8f9894f2e61e tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] Acquiring lock "911b94d1-8c01-49fa-ae13-4565a028676e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 763.964104] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f241914f-1c26-4436-8658-8f9894f2e61e tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] Lock "911b94d1-8c01-49fa-ae13-4565a028676e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 768.755037] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 768.755037] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 768.765528] env[60788]: WARNING oslo_vmware.rw_handles [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 768.765528] env[60788]: ERROR oslo_vmware.rw_handles [ 768.765924] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239
tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 768.767577] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 768.767807] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Copying Virtual Disk [datastore2] vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/e4f45213-93a6-4fdf-b511-06dec7b5f7fd/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 768.768106] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4e1567e0-f021-4ed6-9026-c72d163f25d7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 768.775811] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] There are 0 instances to clean {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 768.776335] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 768.776335] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances with incomplete migration {{(pid=60788) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 768.780109] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Waiting for the task: (returnval){ [ 768.780109] env[60788]: value = "task-2205144" [ 768.780109] env[60788]: _type = "Task" [ 768.780109] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 768.790326] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Task: {'id': task-2205144, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 768.790735] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 769.297648] env[60788]: DEBUG oslo_vmware.exceptions [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 769.297648] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 769.297648] env[60788]: ERROR nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 769.297648] env[60788]: Faults: ['InvalidArgument'] [ 769.297648] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Traceback (most recent call last): [ 769.297648] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 769.297648] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] yield resources [ 769.297648] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 769.297648] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self.driver.spawn(context, instance, image_meta, [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self._vmops.spawn(context, instance, image_meta, injected_files, [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self._fetch_image_if_missing(context, vi) [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] image_cache(vi, tmp_image_ds_loc) [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 537, in _cache_sparse_image [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] vm_util.copy_virtual_disk( [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] session._wait_for_task(vmdk_copy_task) [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] return self.wait_for_task(task_ref) [ 769.298259] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] return evt.wait() [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] result = hub.switch() [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] return self.greenlet.switch() [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self.f(*self.args, **self.kw) [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] raise exceptions.translate_fault(task_info.error) [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Faults: ['InvalidArgument'] [ 769.298654] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] [ 769.299032] env[60788]: INFO nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Terminating instance [ 769.299032] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 769.299032] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 769.299032] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-efa2b3bd-9bea-4286-ba61-101feab5046b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.300650] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 769.300863] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 769.302074] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a33e548-0e27-48f1-94cc-deb2fc52a445 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.306256] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 769.306468] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 769.307951] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-704058f2-707d-444b-b732-7490e6223e50 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.312962] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 769.313765] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-baefbb0f-b19e-46de-8af4-1756deb9f414 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.316700] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Waiting for the task: (returnval){ [ 769.316700] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52947169-4911-8f30-48f1-55080199de9d" [ 769.316700] env[60788]: _type = "Task" [ 769.316700] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 769.325497] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52947169-4911-8f30-48f1-55080199de9d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 769.398518] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 769.398769] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 769.398986] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Deleting the datastore file [datastore2] e883e763-d7c1-4eae-af6e-4a4e4a84e323 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 769.399311] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9e772078-4968-477a-bc98-447c23590c5f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.405884] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Waiting for the task: (returnval){ [ 769.405884] env[60788]: value = "task-2205146" [ 769.405884] env[60788]: _type = "Task" [ 769.405884] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 769.414364] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Task: {'id': task-2205146, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 769.810481] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 769.834518] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 769.834518] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Creating directory with path [datastore2] vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 769.834518] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b6489f03-f6a9-4f35-867e-5273674ea3c5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.870651] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Created directory with path [datastore2] vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 769.870651] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Fetch image to [datastore2] vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 769.870651] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 769.870651] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd67aeb3-3663-4bb6-9471-1229f274ab93 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.884020] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-200a41ec-219e-4ab1-8cd7-16e3b5668bbd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.891861] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-912ceebc-7e8d-4bfc-85f7-eabd0fc4f576 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.933899] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c5e2d0-2f58-4a42-908a-820e5b476172 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.941823] env[60788]: DEBUG oslo_vmware.api [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Task: {'id': task-2205146, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.236494} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 769.943598] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 769.943822] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 769.944029] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 769.944181] env[60788]: INFO nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Took 0.64 seconds to destroy the instance on the hypervisor. 
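The DeleteDatastoreFile_Task entries above follow oslo.vmware's task-handling pattern: the vSphere call returns a task reference, wait_for_task blocks on it, and _poll_task re-checks progress until the task succeeds or raises. Below is a minimal, illustrative Python sketch of that polling loop, not the actual oslo.vmware implementation; the poll callable, its (state, progress) return shape, and the print format are assumptions made for the example.

import time

def wait_for_task(poll, interval=0.5):
    # Re-check the task until it reaches a terminal state, echoing progress
    # the way the "Task: {...} progress is N%" log lines above do.
    while True:
        state, progress = poll()  # assumed to return (state, percent done)
        print(f"progress is {progress}%.")
        if state == "success":
            return progress
        if state == "error":
            raise RuntimeError("task reported an error")
        time.sleep(interval)

# Example: a fake task that reports progress twice, then completes.
states = iter([("running", 0), ("running", 50), ("success", 100)])
wait_for_task(lambda: next(states), interval=0)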
[ 769.946330] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-013cf182-2c43-4cc4-b3bd-6a83e515986b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 769.948719] env[60788]: DEBUG nova.compute.claims [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 769.948719] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 769.948844] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 769.974887] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 770.065717] env[60788]: DEBUG oslo_vmware.rw_handles [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 770.138021] env[60788]: DEBUG oslo_vmware.rw_handles [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 770.138797] env[60788]: DEBUG oslo_vmware.rw_handles [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 770.549112] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-993317e7-d483-4293-b9bf-e088117c3517 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.558519] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0745e4a3-1e92-4cb8-934d-d7fb5927e909 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.594522] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26b2d5b-2b04-4ba7-b078-6a3bb9295831 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.599962] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de81afc3-1ff3-4467-a193-7b0d9e29e7af {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 770.617231] env[60788]: DEBUG nova.compute.provider_tree [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 770.632499] env[60788]: DEBUG nova.scheduler.client.report [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 770.668896] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.718s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 770.668896] env[60788]: ERROR nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 770.668896] env[60788]: Faults: ['InvalidArgument'] [ 770.668896] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Traceback (most recent call last): [ 770.668896] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 770.668896] env[60788]: 
ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self.driver.spawn(context, instance, image_meta, [ 770.668896] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 770.668896] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self._vmops.spawn(context, instance, image_meta, injected_files, [ 770.668896] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 770.668896] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self._fetch_image_if_missing(context, vi) [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] image_cache(vi, tmp_image_ds_loc) [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] vm_util.copy_virtual_disk( [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] session._wait_for_task(vmdk_copy_task) [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] return self.wait_for_task(task_ref) [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] return evt.wait() [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] result = hub.switch() [ 770.669454] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] return self.greenlet.switch() [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] self.f(*self.args, **self.kw) [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] raise exceptions.translate_fault(task_info.error) [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Faults: ['InvalidArgument'] [ 770.670186] env[60788]: ERROR nova.compute.manager [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] [ 770.670186] env[60788]: DEBUG nova.compute.utils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 770.671306] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Build of instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 was re-scheduled: A specified parameter was not correct: fileType [ 770.671306] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 770.671306] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 770.671691] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 770.671691] env[60788]: DEBUG nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 770.671889] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 770.755857] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 770.755857] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 771.749290] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 771.756905] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 771.756905] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 771.756995] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 771.782966] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783142] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783278] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783409] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783535] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783657] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783777] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.783896] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.784147] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 771.784350] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 771.884208] env[60788]: DEBUG nova.network.neutron [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 771.897427] env[60788]: INFO nova.compute.manager [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Took 1.23 seconds to deallocate network for instance. [ 772.098331] env[60788]: INFO nova.scheduler.client.report [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Deleted allocations for instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 [ 772.129385] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8fce5d00-f181-4ccc-bcde-eee703af4321 tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 249.300s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.130562] env[60788]: DEBUG oslo_concurrency.lockutils [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 49.861s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.130777] env[60788]: DEBUG oslo_concurrency.lockutils [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Acquiring lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.130978] env[60788]: DEBUG oslo_concurrency.lockutils [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239
tempest-ServerExternalEventsTest-1695403239-project-member] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.131162] env[60788]: DEBUG oslo_concurrency.lockutils [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.133871] env[60788]: INFO nova.compute.manager [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Terminating instance [ 772.138043] env[60788]: DEBUG nova.compute.manager [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 772.138043] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 772.138043] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-909ed832-93c2-46e8-8ead-9b885f13eed2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.150193] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a256301-5461-40f8-9a23-f852745f0dfb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.163768] env[60788]: DEBUG nova.compute.manager [None req-97ca0ff1-250a-439a-b9df-1b839dfc77f6 tempest-ServerMetadataTestJSON-2090774188 tempest-ServerMetadataTestJSON-2090774188-project-member] [instance: c7380613-f621-4c56-9e8f-6b4f8dfe3ef1] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 772.189194] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e883e763-d7c1-4eae-af6e-4a4e4a84e323 could not be found.
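The Acquiring / acquired / "released" triples throughout this log come from oslo.concurrency's lockutils decorator, which records how long a caller waited for a lock and how long it held it. The sketch below imitates that logging with plain threading primitives; it is not the real lockutils code, and the lock table and message format are assumptions. It also shows why the logged target names contain ".<locals>.": do_terminate_instance is a closure defined inside terminate_instance, so that is its Python __qualname__.

import threading
import time

_locks = {}  # hypothetical process-local lock table

def synchronized(name):
    lock = _locks.setdefault(name, threading.Lock())
    def wrap(f):
        def inner(*args, **kwargs):
            print(f'Acquiring lock "{name}" by "{f.__qualname__}"')
            t0 = time.monotonic()
            with lock:
                print(f'Lock "{name}" acquired by "{f.__qualname__}" '
                      f':: waited {time.monotonic() - t0:.3f}s')
                t1 = time.monotonic()
                try:
                    return f(*args, **kwargs)
                finally:
                    print(f'Lock "{name}" "released" by "{f.__qualname__}" '
                          f':: held {time.monotonic() - t1:.3f}s')
        return inner
    return wrap

def terminate_instance(instance_uuid):
    @synchronized(instance_uuid)
    def do_terminate_instance():
        pass  # __qualname__ is "terminate_instance.<locals>.do_terminate_instance"
    do_terminate_instance()

terminate_instance("e883e763-d7c1-4eae-af6e-4a4e4a84e323")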
[ 772.189194] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 772.189194] env[60788]: INFO nova.compute.manager [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Took 0.05 seconds to destroy the instance on the hypervisor. [ 772.189194] env[60788]: DEBUG oslo.service.loopingcall [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 772.189194] env[60788]: DEBUG nova.compute.manager [-] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 772.189454] env[60788]: DEBUG nova.network.neutron [-] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 772.217825] env[60788]: DEBUG nova.compute.manager [None req-97ca0ff1-250a-439a-b9df-1b839dfc77f6 tempest-ServerMetadataTestJSON-2090774188 tempest-ServerMetadataTestJSON-2090774188-project-member] [instance: c7380613-f621-4c56-9e8f-6b4f8dfe3ef1] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 772.247430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-97ca0ff1-250a-439a-b9df-1b839dfc77f6 tempest-ServerMetadataTestJSON-2090774188 tempest-ServerMetadataTestJSON-2090774188-project-member] Lock "c7380613-f621-4c56-9e8f-6b4f8dfe3ef1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 225.384s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.249127] env[60788]: DEBUG nova.network.neutron [-] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 772.258114] env[60788]: INFO nova.compute.manager [-] [instance: e883e763-d7c1-4eae-af6e-4a4e4a84e323] Took 0.07 seconds to deallocate network for instance. [ 772.270442] env[60788]: DEBUG nova.compute.manager [None req-5a034956-c054-4769-b4b2-ad45afad48e3 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] [instance: dc9706cc-1c7e-4570-8607-20120306153c] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 772.311345] env[60788]: DEBUG nova.compute.manager [None req-5a034956-c054-4769-b4b2-ad45afad48e3 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] [instance: dc9706cc-1c7e-4570-8607-20120306153c] Instance disappeared before build.
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 772.342205] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5a034956-c054-4769-b4b2-ad45afad48e3 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Lock "dc9706cc-1c7e-4570-8607-20120306153c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.888s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.377315] env[60788]: DEBUG nova.compute.manager [None req-31e5baab-94ed-4e4c-a248-2649db17ae79 tempest-ServersNegativeTestJSON-1857535569 tempest-ServersNegativeTestJSON-1857535569-project-member] [instance: 9aec9e50-0470-43f7-98b2-3f2eac50e6bf] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 772.460842] env[60788]: DEBUG oslo_concurrency.lockutils [None req-282bf19e-dbaf-42f1-9275-20b241a6ad1b tempest-ServerExternalEventsTest-1695403239 tempest-ServerExternalEventsTest-1695403239-project-member] Lock "e883e763-d7c1-4eae-af6e-4a4e4a84e323" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.330s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.463321] env[60788]: DEBUG nova.compute.manager [None req-31e5baab-94ed-4e4c-a248-2649db17ae79 tempest-ServersNegativeTestJSON-1857535569 tempest-ServersNegativeTestJSON-1857535569-project-member] [instance: 9aec9e50-0470-43f7-98b2-3f2eac50e6bf] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 772.700024] env[60788]: DEBUG oslo_concurrency.lockutils [None req-31e5baab-94ed-4e4c-a248-2649db17ae79 tempest-ServersNegativeTestJSON-1857535569 tempest-ServersNegativeTestJSON-1857535569-project-member] Lock "9aec9e50-0470-43f7-98b2-3f2eac50e6bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.794s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.733932] env[60788]: DEBUG nova.compute.manager [None req-d180e74c-e402-4d07-8908-b0efc419db27 tempest-VolumesAssistedSnapshotsTest-1856435780 tempest-VolumesAssistedSnapshotsTest-1856435780-project-member] [instance: 929a6bc2-109d-4753-8b98-8155c0e4e839] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 772.753253] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 772.776284] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.776701] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.776701] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.776836] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 772.779075] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21019aba-56b5-4d56-9717-e2fcab2dc7d2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.788296] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e46bcef-fd23-4441-85cf-c539aedd653b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.807913] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f01b46f-9172-4476-ad06-5cfd795379a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.811887] env[60788]: DEBUG nova.compute.manager [None req-d180e74c-e402-4d07-8908-b0efc419db27 tempest-VolumesAssistedSnapshotsTest-1856435780 tempest-VolumesAssistedSnapshotsTest-1856435780-project-member] [instance: 929a6bc2-109d-4753-8b98-8155c0e4e839] Instance disappeared before build. 
[ 772.821904] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2399fbe-c5f6-4bcc-8460-25389e038d0c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 772.860592] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181209MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 772.860592] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.860592] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 772.866796] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d180e74c-e402-4d07-8908-b0efc419db27 tempest-VolumesAssistedSnapshotsTest-1856435780 tempest-VolumesAssistedSnapshotsTest-1856435780-project-member] Lock "929a6bc2-109d-4753-8b98-8155c0e4e839" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.594s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 772.883390] env[60788]: DEBUG nova.compute.manager [None req-37167931-5b27-4a56-99c9-ba329e7688d4 tempest-ServersWithSpecificFlavorTestJSON-1896831385 tempest-ServersWithSpecificFlavorTestJSON-1896831385-project-member] [instance: ee5957af-d4c1-4c71-82ba-83c06eb08869] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 772.994674] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.994844] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb58b00e-1a78-4750-b912-48c94144ea66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.994972] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 331fc548-2076-48e2-a84b-94130a99c2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.995118] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.995258] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.996649] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.996868] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.997021] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 772.997154] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 773.019174] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.031823] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6f437d9f-f904-46ab-9dc6-9902c2ea4c71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.042878] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ef9990e1-e0a7-41c0-b738-de213fd7046a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.053084] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d159fbfa-f391-41f8-97ba-eb145eed26e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.062685] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 064e7e7c-eeca-4822-9d5a-148b9fbdc1f0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.071958] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4e86b919-f5f8-458c-a588-bd08bdcccf3b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.081552] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e991879c-de94-4e14-9480-95c95bcaaa05 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.090804] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.102017] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.112017] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.121535] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6a101160-4c4a-42a5-9dfa-e7f41aa9788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.130698] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 57c974dd-093c-44c1-ab08-1659e25bb392 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.142229] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d5972768-f55f-495b-a49f-43b00c4647c2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.151327] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 911b94d1-8c01-49fa-ae13-4565a028676e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 773.151568] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 773.151717] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 773.173488] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing inventories for resource provider 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 773.195174] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating ProviderTree inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 773.195174] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 773.211631] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing aggregate associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, aggregates: None {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 773.245070] env[60788]: DEBUG nova.compute.manager [None req-37167931-5b27-4a56-99c9-ba329e7688d4 tempest-ServersWithSpecificFlavorTestJSON-1896831385 tempest-ServersWithSpecificFlavorTestJSON-1896831385-project-member] [instance: ee5957af-d4c1-4c71-82ba-83c06eb08869] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 773.251122] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing trait associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 773.329181] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37167931-5b27-4a56-99c9-ba329e7688d4 tempest-ServersWithSpecificFlavorTestJSON-1896831385 tempest-ServersWithSpecificFlavorTestJSON-1896831385-project-member] Lock "ee5957af-d4c1-4c71-82ba-83c06eb08869" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 220.691s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.346744] env[60788]: DEBUG nova.compute.manager [None req-fafb2cac-a1c8-4ff5-94cd-15183f4ef5ac tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] [instance: 951453a7-a034-4111-9c5c-71d5c25245ff] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 773.378903] env[60788]: DEBUG nova.compute.manager [None req-fafb2cac-a1c8-4ff5-94cd-15183f4ef5ac tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] [instance: 951453a7-a034-4111-9c5c-71d5c25245ff] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 773.434595] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fafb2cac-a1c8-4ff5-94cd-15183f4ef5ac tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] Lock "951453a7-a034-4111-9c5c-71d5c25245ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.458s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.471329] env[60788]: DEBUG nova.compute.manager [None req-7d7e54e7-25eb-4724-9c63-dd7ca41cff5b tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] [instance: c27066c4-1fbb-4918-94c4-62a8bf1c2dda] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 773.576128] env[60788]: DEBUG nova.compute.manager [None req-7d7e54e7-25eb-4724-9c63-dd7ca41cff5b tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] [instance: c27066c4-1fbb-4918-94c4-62a8bf1c2dda] Instance disappeared before build.
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 773.637248] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7d7e54e7-25eb-4724-9c63-dd7ca41cff5b tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Lock "c27066c4-1fbb-4918-94c4-62a8bf1c2dda" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.652s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.660446] env[60788]: DEBUG nova.compute.manager [None req-1b3f4b47-9993-4a1c-808c-7df0b4bca4b0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] [instance: a92343b0-aad3-4416-9305-1432b35ae1a8] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 773.701085] env[60788]: DEBUG nova.compute.manager [None req-1b3f4b47-9993-4a1c-808c-7df0b4bca4b0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] [instance: a92343b0-aad3-4416-9305-1432b35ae1a8] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 773.740753] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba02f3d7-ad4e-4760-9aac-04ecc0421992 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.751153] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b3b380-2ac9-4980-a0f2-5a452ffa099f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.787370] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a76c5fd6-2e23-40c2-b495-363c41049e39 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.796115] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c11321ad-ff6d-457a-8d78-9646e7390e81 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 773.804181] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1b3f4b47-9993-4a1c-808c-7df0b4bca4b0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Lock "a92343b0-aad3-4416-9305-1432b35ae1a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.153s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.817650] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 773.826792] env[60788]: DEBUG nova.compute.manager [None req-2bbfd857-58c7-4740-9290-f170bf8ad744 tempest-ServersTestFqdnHostnames-2032606859 tempest-ServersTestFqdnHostnames-2032606859-project-member] [instance: cd2ac191-ea52-43cd-a20b-87b963112818] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 773.831114] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 773.875578] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 773.876122] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.015s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.883175] env[60788]: DEBUG nova.compute.manager [None req-2bbfd857-58c7-4740-9290-f170bf8ad744 tempest-ServersTestFqdnHostnames-2032606859 tempest-ServersTestFqdnHostnames-2032606859-project-member] [instance: cd2ac191-ea52-43cd-a20b-87b963112818] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 773.914769] env[60788]: DEBUG oslo_concurrency.lockutils [None req-2bbfd857-58c7-4740-9290-f170bf8ad744 tempest-ServersTestFqdnHostnames-2032606859 tempest-ServersTestFqdnHostnames-2032606859-project-member] Lock "cd2ac191-ea52-43cd-a20b-87b963112818" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.207s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 773.948285] env[60788]: DEBUG nova.compute.manager [None req-d3e697bf-90c1-4b85-acd2-e8578cf21fca tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] [instance: 008c517b-5838-43a0-aad3-5c7436d00275] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 773.994379] env[60788]: DEBUG nova.compute.manager [None req-d3e697bf-90c1-4b85-acd2-e8578cf21fca tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] [instance: 008c517b-5838-43a0-aad3-5c7436d00275] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
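The inventory dict logged above, together with the earlier "Final resource view" entry (used_ram=1664MB, used_disk=9GB, used_vcpus=9), can be cross-checked with the usual placement arithmetic. A quick sketch, with every value copied from the log:

    # Inventory as reported for provider 75623588-d529-4955-b0d7-8c3260d605e7.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(inv):
        # Placement capacity: (total - reserved) * allocation_ratio.
        return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

    print(capacity(inventory['VCPU']))   # 192.0 schedulable VCPUs
    # Nine m1.nano instances (1 VCPU / 128 MB / 1 GB each) plus the 512 MB
    # reserved for the host:
    print(512 + 9 * 128)                 # 1664 -> matches used_ram=1664MB
    print(9 * 1)                         # 9    -> matches used_disk=9GB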
[ 774.035569] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d3e697bf-90c1-4b85-acd2-e8578cf21fca tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] Lock "008c517b-5838-43a0-aad3-5c7436d00275" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 208.261s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.050679] env[60788]: DEBUG nova.compute.manager [None req-8d9448e2-31bd-45f7-8631-b52590c958f0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] [instance: e4bfd8ff-c503-420a-8a89-34c652d9fb2c] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 774.084539] env[60788]: DEBUG nova.compute.manager [None req-8d9448e2-31bd-45f7-8631-b52590c958f0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] [instance: e4bfd8ff-c503-420a-8a89-34c652d9fb2c] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 774.120273] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8d9448e2-31bd-45f7-8631-b52590c958f0 tempest-ListServerFiltersTestJSON-4972523 tempest-ListServerFiltersTestJSON-4972523-project-member] Lock "e4bfd8ff-c503-420a-8a89-34c652d9fb2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.447s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.158377] env[60788]: DEBUG nova.compute.manager [None req-97c5b76c-1771-41ce-b915-246f2c2379e4 tempest-ServersTestJSON-162999821 tempest-ServersTestJSON-162999821-project-member] [instance: 1e6f4b96-7207-4208-9193-b0d207b1c703] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 774.189596] env[60788]: DEBUG nova.compute.manager [None req-97c5b76c-1771-41ce-b915-246f2c2379e4 tempest-ServersTestJSON-162999821 tempest-ServersTestJSON-162999821-project-member] [instance: 1e6f4b96-7207-4208-9193-b0d207b1c703] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 774.218867] env[60788]: DEBUG oslo_concurrency.lockutils [None req-97c5b76c-1771-41ce-b915-246f2c2379e4 tempest-ServersTestJSON-162999821 tempest-ServersTestJSON-162999821-project-member] Lock "1e6f4b96-7207-4208-9193-b0d207b1c703" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.539s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.252585] env[60788]: DEBUG nova.compute.manager [None req-a3ae2cfa-bddb-4d68-b3c4-cf196a6e2518 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] [instance: 5f6d54d4-6862-46b0-9558-5db0c5b392d1] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 774.290254] env[60788]: DEBUG nova.compute.manager [None req-a3ae2cfa-bddb-4d68-b3c4-cf196a6e2518 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] [instance: 5f6d54d4-6862-46b0-9558-5db0c5b392d1] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 774.342205] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a3ae2cfa-bddb-4d68-b3c4-cf196a6e2518 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] Lock "5f6d54d4-6862-46b0-9558-5db0c5b392d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.485s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.364698] env[60788]: DEBUG nova.compute.manager [None req-0bcc92c4-e4e6-47e7-a654-c7da1bc77663 tempest-InstanceActionsV221TestJSON-1379281465 tempest-InstanceActionsV221TestJSON-1379281465-project-member] [instance: 977ea808-4e2d-4388-a5af-93048b5754e3] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 774.438098] env[60788]: DEBUG nova.compute.manager [None req-0bcc92c4-e4e6-47e7-a654-c7da1bc77663 tempest-InstanceActionsV221TestJSON-1379281465 tempest-InstanceActionsV221TestJSON-1379281465-project-member] [instance: 977ea808-4e2d-4388-a5af-93048b5754e3] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 774.481947] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0bcc92c4-e4e6-47e7-a654-c7da1bc77663 tempest-InstanceActionsV221TestJSON-1379281465 tempest-InstanceActionsV221TestJSON-1379281465-project-member] Lock "977ea808-4e2d-4388-a5af-93048b5754e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.736s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 774.497160] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 774.610992] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 774.611336] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 774.615174] env[60788]: INFO nova.compute.claims [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 774.877163] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 774.877485] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 774.877538] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 774.877686] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 775.295597] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdec0760-6800-4c02-bc19-98b7788eafde {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.305545] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19060568-d84f-4c04-9d72-2a4f55b28050 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.339744] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1efd5177-a4cb-4ef4-845b-fbe952768af2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.352021] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e60e84d-c1bb-40e2-a71a-961fc9670076 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 775.366225] env[60788]: DEBUG 
nova.compute.provider_tree [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 775.379833] env[60788]: DEBUG nova.scheduler.client.report [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 775.763296] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.152s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 775.763821] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 775.785649] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "c206be99-2f74-4c28-a008-e6edcccf65bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 775.785998] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 776.013700] env[60788]: DEBUG nova.compute.utils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 776.017019] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Allocating IP information in the background.
{{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 776.017019] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 776.062072] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 776.321516] env[60788]: DEBUG nova.policy [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3fd0f3818eda48409da3a8977e2d963b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2313b232c99a4a16a40e01fec91c13f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 776.357973] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 776.389236] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 776.389490] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 776.389642] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 776.389890] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 776.389954] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 776.390114] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 776.390331] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 776.390484] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 776.391319] env[60788]: DEBUG nova.virt.hardware [None
req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 776.391563] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 776.391753] env[60788]: DEBUG nova.virt.hardware [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 776.392664] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e2f92bf-48dd-44e3-9e3f-3e60aa35e78a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.404896] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fe3c6aa-be5d-4c67-a730-99d29e9dacc6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.474403] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "01821598-4692-440b-8128-c50e359386e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 777.358584] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Successfully created port: 0cbc288a-59d8-43a2-94e4-d2f333e429cc {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 778.847430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fc197db7-f494-4fbe-8055-4da203d1be00 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] Acquiring lock "d136a94d-344a-4697-97b5-3d732a16f4a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.847430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fc197db7-f494-4fbe-8055-4da203d1be00 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] Lock "d136a94d-344a-4697-97b5-3d732a16f4a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 780.306305] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Successfully updated port: 0cbc288a-59d8-43a2-94e4-d2f333e429cc {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
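The nova.virt.hardware lines above enumerate CPU topologies for the one-vCPU flavor and land on 1:1:1. A simplified sketch of that search (an illustration of the constraint being solved, not Nova's actual implementation): every (sockets, cores, threads) triple whose product equals the vCPU count and respects the per-dimension maxima is a candidate.

    from collections import namedtuple

    VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Enumerate candidate topologies exactly as the constraint reads.
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield VirtCPUTopology(s, c, t)

    print(list(possible_topologies(1)))
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)] -- the single
    # candidate reported as "Got 1 possible topologies" above.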
[ 780.323217] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 780.323359] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 780.323504] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 780.479845] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 781.408498] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Updating instance_info_cache with network_info: [{"id": "0cbc288a-59d8-43a2-94e4-d2f333e429cc", "address": "fa:16:3e:ce:20:d5", "network": {"id": "31e5241c-3d65-4624-bcf6-45322ce7ebd6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1793513747-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2313b232c99a4a16a40e01fec91c13f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbdab640-5fea-4254-8bd3-f855b7eaca0d", "external-id": "nsx-vlan-transportzone-615", "segmentation_id": 615, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cbc288a-59", "ovs_interfaceid": "0cbc288a-59d8-43a2-94e4-d2f333e429cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 781.434087] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 781.436037] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance network_info: |[{"id": "0cbc288a-59d8-43a2-94e4-d2f333e429cc", "address": "fa:16:3e:ce:20:d5", "network": {"id": "31e5241c-3d65-4624-bcf6-45322ce7ebd6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1793513747-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2313b232c99a4a16a40e01fec91c13f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbdab640-5fea-4254-8bd3-f855b7eaca0d", "external-id": "nsx-vlan-transportzone-615", "segmentation_id": 615, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cbc288a-59", "ovs_interfaceid": "0cbc288a-59d8-43a2-94e4-d2f333e429cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 781.436472] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ce:20:d5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbdab640-5fea-4254-8bd3-f855b7eaca0d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0cbc288a-59d8-43a2-94e4-d2f333e429cc', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 781.442994] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating folder: Project (2313b232c99a4a16a40e01fec91c13f2). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 781.446306] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d5fb1baa-977a-46ef-b901-9a31451f6e75 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.462435] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Created folder: Project (2313b232c99a4a16a40e01fec91c13f2) in parent group-v449747. [ 781.462755] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating folder: Instances. Parent ref: group-v449788. 
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 781.462978] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8dd76335-c7ef-478e-a13d-ab102baecb6d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.472917] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Created folder: Instances in parent group-v449788. [ 781.473188] env[60788]: DEBUG oslo.service.loopingcall [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 781.473380] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 01821598-4692-440b-8128-c50e359386e2] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 781.473612] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c0b15d54-8e50-4bd2-be1b-6b3199014367 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 781.497539] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 781.497539] env[60788]: value = "task-2205149" [ 781.497539] env[60788]: _type = "Task" [ 781.497539] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 781.505586] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205149, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 782.010228] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205149, 'name': CreateVM_Task, 'duration_secs': 0.354653} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
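The "Waiting for the task" / "progress is 0%" / "completed successfully" sequence is oslo.vmware's task poller at work. A sketch of the session/task pattern, assuming placeholder vCenter credentials and pre-fetched folder, config-spec, and resource-pool managed-object references:

    from oslo_vmware import api

    def create_vm(folder, config_spec, resource_pool):
        # The session logs in on construction (credentials are placeholders).
        session = api.VMwareAPISession('vc1.example.test', 'user', 'password',
                                       api_retry_count=10,
                                       task_poll_interval=0.5)
        # *_Task vSphere methods return a task moref immediately;
        # wait_for_task polls it (producing the "progress is N%" lines)
        # and returns the task info on success, raising on error.
        task = session.invoke_api(session.vim, 'CreateVM_Task', folder,
                                  config=config_spec, pool=resource_pool)
        return session.wait_for_task(task)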
[ 782.011213] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 01821598-4692-440b-8128-c50e359386e2] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 782.011312] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 782.011526] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 782.012907] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 782.012907] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-89f41d5d-2c5d-4b6b-9bee-0c2d0fc7a954 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.018260] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){ [ 782.018260] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52561de4-f0ee-c419-bfd1-a1b83ed36ff7" [ 782.018260] env[60788]: _type = "Task" [ 782.018260] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 782.026998] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52561de4-f0ee-c419-bfd1-a1b83ed36ff7, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 782.402014] env[60788]: DEBUG nova.compute.manager [req-a0657bf1-193b-45ee-9156-282ec77d1bc3 req-5d63ce31-c95b-4d89-84b4-a5860ac22fb6 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Received event network-vif-plugged-0cbc288a-59d8-43a2-94e4-d2f333e429cc {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 782.402362] env[60788]: DEBUG oslo_concurrency.lockutils [req-a0657bf1-193b-45ee-9156-282ec77d1bc3 req-5d63ce31-c95b-4d89-84b4-a5860ac22fb6 service nova] Acquiring lock "01821598-4692-440b-8128-c50e359386e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.402658] env[60788]: DEBUG oslo_concurrency.lockutils [req-a0657bf1-193b-45ee-9156-282ec77d1bc3 req-5d63ce31-c95b-4d89-84b4-a5860ac22fb6 service nova] Lock "01821598-4692-440b-8128-c50e359386e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 782.402851] env[60788]: DEBUG oslo_concurrency.lockutils [req-a0657bf1-193b-45ee-9156-282ec77d1bc3 req-5d63ce31-c95b-4d89-84b4-a5860ac22fb6 service nova] Lock "01821598-4692-440b-8128-c50e359386e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 782.403075] env[60788]: DEBUG nova.compute.manager [req-a0657bf1-193b-45ee-9156-282ec77d1bc3 req-5d63ce31-c95b-4d89-84b4-a5860ac22fb6 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] No waiting events found dispatching network-vif-plugged-0cbc288a-59d8-43a2-94e4-d2f333e429cc {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 782.403298] env[60788]: WARNING nova.compute.manager [req-a0657bf1-193b-45ee-9156-282ec77d1bc3 req-5d63ce31-c95b-4d89-84b4-a5860ac22fb6 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Received unexpected event network-vif-plugged-0cbc288a-59d8-43a2-94e4-d2f333e429cc for instance with vm_state building and task_state deleting. 
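The WARNING above is a benign race: Neutron delivered network-vif-plugged for port 0cbc288a-59d8-43a2-94e4-d2f333e429cc, but by the time the event arrived the instance had already moved to task_state deleting, so pop_instance_event found no registered waiter to dispatch to. To survey how often this happens in a run, a small scan keyed to the message format above (assuming one log record per line, as in the raw nova-compute log):

```python
import re
import sys

# Field layout mirrors the WARNING record shown above.
pat = re.compile(
    r"WARNING nova\.compute\.manager .*?\[instance: (?P<uuid>[0-9a-f-]+)\] "
    r"Received unexpected event (?P<event>\S+) for instance with "
    r"vm_state (?P<vm_state>\w+) and task_state (?P<task_state>\w+)"
)

with open(sys.argv[1]) as log:
    for line in log:
        m = pat.search(line)
        if m:
            print(m.group("uuid"), m.group("event"),
                  m.group("vm_state"), m.group("task_state"))
```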
[ 782.542608] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 782.542608] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 782.542608] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 783.221906] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Acquiring lock "1ae1eb4b-4696-4592-a758-79b2211d35c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.222592] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Lock "1ae1eb4b-4696-4592-a758-79b2211d35c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 783.255221] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Acquiring lock "4bfbddb3-f66c-4059-8624-654e180ab997" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.255572] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Lock "4bfbddb3-f66c-4059-8624-654e180ab997" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 783.291600] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Acquiring lock "5ebb3604-792d-4fd7-95e6-d8a826c2d50a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
{{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.291857] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Lock "5ebb3604-792d-4fd7-95e6-d8a826c2d50a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 785.744826] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5e854664-c4b9-4f68-b7b1-cb373930a57f tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "b7b0591b-123b-49ad-8ab6-6881d0c7888b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 785.745122] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5e854664-c4b9-4f68-b7b1-cb373930a57f tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "b7b0591b-123b-49ad-8ab6-6881d0c7888b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 785.768848] env[60788]: DEBUG nova.compute.manager [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Received event network-changed-0cbc288a-59d8-43a2-94e4-d2f333e429cc {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 785.769068] env[60788]: DEBUG nova.compute.manager [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Refreshing instance network info cache due to event network-changed-0cbc288a-59d8-43a2-94e4-d2f333e429cc. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 785.769287] env[60788]: DEBUG oslo_concurrency.lockutils [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] Acquiring lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 785.769451] env[60788]: DEBUG oslo_concurrency.lockutils [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] Acquired lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 785.769655] env[60788]: DEBUG nova.network.neutron [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Refreshing network info cache for port 0cbc288a-59d8-43a2-94e4-d2f333e429cc {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 786.767188] env[60788]: DEBUG nova.network.neutron [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Updated VIF entry in instance network info cache for port 0cbc288a-59d8-43a2-94e4-d2f333e429cc. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 786.768237] env[60788]: DEBUG nova.network.neutron [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] [instance: 01821598-4692-440b-8128-c50e359386e2] Updating instance_info_cache with network_info: [{"id": "0cbc288a-59d8-43a2-94e4-d2f333e429cc", "address": "fa:16:3e:ce:20:d5", "network": {"id": "31e5241c-3d65-4624-bcf6-45322ce7ebd6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1793513747-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2313b232c99a4a16a40e01fec91c13f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbdab640-5fea-4254-8bd3-f855b7eaca0d", "external-id": "nsx-vlan-transportzone-615", "segmentation_id": 615, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cbc288a-59", "ovs_interfaceid": "0cbc288a-59d8-43a2-94e4-d2f333e429cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 786.782376] env[60788]: DEBUG oslo_concurrency.lockutils [req-a0b57aa8-8dc5-41ef-a1f3-90609c4f8211 req-aa61e436-1f51-4213-ae25-28d8a4b29585 service nova] Releasing lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 787.551312] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e9757ac4-1971-433c-94b6-35ae5be9a558 tempest-ServerTagsTestJSON-1708565971 tempest-ServerTagsTestJSON-1708565971-project-member] Acquiring lock 
"71ac0cb5-ebac-4f22-897c-1742b5416fca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 787.551566] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e9757ac4-1971-433c-94b6-35ae5be9a558 tempest-ServerTagsTestJSON-1708565971 tempest-ServerTagsTestJSON-1708565971-project-member] Lock "71ac0cb5-ebac-4f22-897c-1742b5416fca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 787.580078] env[60788]: DEBUG oslo_concurrency.lockutils [None req-463b86de-7bbc-4955-a233-560623237ed5 tempest-ImagesOneServerNegativeTestJSON-1867104132 tempest-ImagesOneServerNegativeTestJSON-1867104132-project-member] Acquiring lock "55ccb77a-7c54-4e4f-a665-43dc1c30e595" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 787.580078] env[60788]: DEBUG oslo_concurrency.lockutils [None req-463b86de-7bbc-4955-a233-560623237ed5 tempest-ImagesOneServerNegativeTestJSON-1867104132 tempest-ImagesOneServerNegativeTestJSON-1867104132-project-member] Lock "55ccb77a-7c54-4e4f-a665-43dc1c30e595" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.109697] env[60788]: DEBUG oslo_concurrency.lockutils [None req-91c99aef-65a8-4b70-b938-eeab8a6850ca tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] Acquiring lock "71acb134-101c-482b-9e5f-bbc18b8e01d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.109697] env[60788]: DEBUG oslo_concurrency.lockutils [None req-91c99aef-65a8-4b70-b938-eeab8a6850ca tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] Lock "71acb134-101c-482b-9e5f-bbc18b8e01d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.808691] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3f263ee7-2870-4d1f-8d87-c94368b52fe5 tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] Acquiring lock "9da7df70-e116-4bf9-83fb-626208162b27" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.808928] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3f263ee7-2870-4d1f-8d87-c94368b52fe5 tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] Lock "9da7df70-e116-4bf9-83fb-626208162b27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 794.200273] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8208d9af-1af9-4e0a-ac46-6b181f238d79 tempest-ImagesNegativeTestJSON-1147299362 tempest-ImagesNegativeTestJSON-1147299362-project-member] Acquiring lock "f39ca342-04ed-45d7-8017-717d3a9ba244" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 794.200635] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8208d9af-1af9-4e0a-ac46-6b181f238d79 tempest-ImagesNegativeTestJSON-1147299362 tempest-ImagesNegativeTestJSON-1147299362-project-member] Lock "f39ca342-04ed-45d7-8017-717d3a9ba244" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 800.205999] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c58645f-7047-4da1-a24f-db229131d350 tempest-ServerShowV254Test-617035683 tempest-ServerShowV254Test-617035683-project-member] Acquiring lock "b4004d4f-8a7f-42be-9ce4-5ab53ae62f78" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 800.206350] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c58645f-7047-4da1-a24f-db229131d350 tempest-ServerShowV254Test-617035683 tempest-ServerShowV254Test-617035683-project-member] Lock "b4004d4f-8a7f-42be-9ce4-5ab53ae62f78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 805.585904] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3028fffc-75c6-4bda-a932-62061e55af94 tempest-ServerActionsV293TestJSON-413592385 tempest-ServerActionsV293TestJSON-413592385-project-member] Acquiring lock "822c4411-1759-4d9e-820a-5d617fdd2488" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.586233] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3028fffc-75c6-4bda-a932-62061e55af94 tempest-ServerActionsV293TestJSON-413592385 tempest-ServerActionsV293TestJSON-413592385-project-member] Lock "822c4411-1759-4d9e-820a-5d617fdd2488" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 807.869181] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6cbf59b-5cee-4a47-882a-f105b92835ec tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "61d9ecf8-0ed5-4451-9953-e53cabecf36b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 807.869454] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6cbf59b-5cee-4a47-882a-f105b92835ec tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] 
Lock "61d9ecf8-0ed5-4451-9953-e53cabecf36b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.710348] env[60788]: WARNING oslo_vmware.rw_handles [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 816.710348] env[60788]: ERROR oslo_vmware.rw_handles [ 816.710933] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 816.712555] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 816.712818] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Copying Virtual Disk [datastore2] vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/c8f45c31-17e8-4554-ae3d-ac9819dad8b7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 816.713141] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-00755d87-3df9-4fcb-9fa9-5481ce08aff7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 816.722543] env[60788]: DEBUG oslo_vmware.api 
[None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Waiting for the task: (returnval){ [ 816.722543] env[60788]: value = "task-2205161" [ 816.722543] env[60788]: _type = "Task" [ 816.722543] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 816.732724] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Task: {'id': task-2205161, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.233041] env[60788]: DEBUG oslo_vmware.exceptions [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 817.233368] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 817.233941] env[60788]: ERROR nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 817.233941] env[60788]: Faults: ['InvalidArgument'] [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Traceback (most recent call last): [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] yield resources [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self.driver.spawn(context, instance, image_meta, [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self._fetch_image_if_missing(context, vi) [ 817.233941] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] image_cache(vi, tmp_image_ds_loc) [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] vm_util.copy_virtual_disk( [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] session._wait_for_task(vmdk_copy_task) [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] return self.wait_for_task(task_ref) [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] return evt.wait() [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] result = hub.switch() [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 817.234329] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] return self.greenlet.switch() [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self.f(*self.args, **self.kw) [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] raise exceptions.translate_fault(task_info.error) [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Faults: ['InvalidArgument'] [ 817.234767] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] [ 817.234767] env[60788]: INFO nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Terminating 
instance [ 817.235802] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 817.236016] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 817.237428] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 817.237637] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 817.237870] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7f9ab424-7026-4fd4-bec0-d9b317208b09 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.240184] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ded4b63-dcc6-4b15-a1b1-67f62332f995 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.247387] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 817.247614] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-909135be-bb31-4f41-a0c4-423eaca97aa2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.249843] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 817.250023] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 817.250965] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3417351-bcd2-4930-9752-ec10030a1d33 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.256165] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Waiting for the task: (returnval){ [ 817.256165] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52e7605a-775b-69a1-5a03-e33e7bd86cc1" [ 817.256165] env[60788]: _type = "Task" [ 817.256165] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.263810] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52e7605a-775b-69a1-5a03-e33e7bd86cc1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.321458] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 817.321759] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 817.322170] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Deleting the datastore file [datastore2] 231fcc6a-7ec4-4202-b960-ddc966ef2b9c {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 817.322411] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b6bcb4b0-60fa-4fca-bbdf-b68b12550b66 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.329565] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Waiting for the task: (returnval){ [ 817.329565] env[60788]: value = "task-2205163" [ 817.329565] env[60788]: _type = "Task" [ 817.329565] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 817.338787] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Task: {'id': task-2205163, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 817.767106] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 817.767370] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Creating directory with path [datastore2] vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 817.767606] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e7adde7b-2932-49bd-9497-075b5e1a5384 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.780171] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Created directory with path [datastore2] vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 817.780412] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Fetch image to [datastore2] vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 817.780646] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 817.781358] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-326b3123-ce34-4d24-ab0f-c3e65b0421d7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.788668] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11fa18ac-baa7-45a1-a3b9-b3da2ad286ce {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.798274] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-661d7baa-7d2c-42ca-bd1c-eedffb338ab7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.834039] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d0309e4-70bc-47f4-8ea3-4db59ed6e9a1 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.844907] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ea1b1382-0884-435e-9c05-6c068d15abd5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 817.846741] env[60788]: DEBUG oslo_vmware.api [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Task: {'id': task-2205163, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084393} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 817.846982] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 817.847173] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 817.847339] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 817.847509] env[60788]: INFO nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Took 0.61 seconds to destroy the instance on the hypervisor. 
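Each "completed successfully" record above carries a duration_secs field (0.084393s for this DeleteDatastoreFile_Task, 0.354653s for the earlier CreateVM_Task). Aggregating those timings per task name is a quick way to spot slow vCenter operations in a run; a sketch, again assuming one record per line:

```python
import re
import sys
from collections import defaultdict

# Matches the completion records above, e.g.:
# Task: {'id': task-2205163, 'name': DeleteDatastoreFile_Task,
#        'duration_secs': 0.084393} completed successfully.
pat = re.compile(
    r"Task: \{'id': (?P<tid>[^,]+), 'name': (?P<name>\w+)"
    r"(?:, 'duration_secs': (?P<secs>[\d.]+))?\} completed successfully"
)

durations = defaultdict(list)
with open(sys.argv[1]) as log:
    for line in log:
        m = pat.search(line)
        if m and m.group("secs"):
            durations[m.group("name")].append(float(m.group("secs")))

for name, secs in sorted(durations.items()):
    print(f"{name}: n={len(secs)} max={max(secs):.3f}s "
          f"avg={sum(secs) / len(secs):.3f}s")
```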
[ 817.849916] env[60788]: DEBUG nova.compute.claims [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 817.850101] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.850317] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 817.873228] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 817.925383] env[60788]: DEBUG oslo_vmware.rw_handles [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 817.984971] env[60788]: DEBUG oslo_vmware.rw_handles [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 817.985185] env[60788]: DEBUG oslo_vmware.rw_handles [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
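The image fetch above streams the Glance image data over HTTPS straight into the datastore through the ESX host's file-access endpoint: a /folder/ URL that encodes the datastore-relative file path and carries the datacenter path and datastore name as query parameters. A sketch that rebuilds a URL in the same format as the one logged (illustrative only; the helper name is ours):

```python
from urllib.parse import quote, urlencode

def datastore_file_url(host, ds_path, dc_path="ha-datacenter",
                       ds_name="datastore2", port=443):
    """Build an ESX /folder/ file URL in the format seen in the log."""
    query = urlencode({"dcPath": dc_path, "dsName": ds_name})
    return f"https://{host}:{port}/folder/{quote(ds_path)}?{query}"

# Reproduces the write-handle URL from the records above.
print(datastore_file_url(
    "esx7c2n3.openstack.eu-de-1.cloud.sap",
    "vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/"
    "1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk"))
```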
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 818.283185] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e34ba687-7f38-4873-a96b-cffa73210399 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.291198] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5156aafb-260d-4f05-9e5e-2a069be14151 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.320422] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c841156-c12d-4ede-b8dd-0661732cc880 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.327475] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c9b1f92-5739-45d7-ac28-3cb86ba8ff7b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.340785] env[60788]: DEBUG nova.compute.provider_tree [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 818.350023] env[60788]: DEBUG nova.scheduler.client.report [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 818.366017] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.516s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.366569] env[60788]: ERROR nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 818.366569] env[60788]: Faults: ['InvalidArgument'] [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Traceback (most recent call last): [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 818.366569] env[60788]: ERROR 
nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self.driver.spawn(context, instance, image_meta, [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self._fetch_image_if_missing(context, vi) [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] image_cache(vi, tmp_image_ds_loc) [ 818.366569] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] vm_util.copy_virtual_disk( [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] session._wait_for_task(vmdk_copy_task) [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] return self.wait_for_task(task_ref) [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] return evt.wait() [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] result = hub.switch() [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] return self.greenlet.switch() [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 818.367020] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] self.f(*self.args, **self.kw) [ 818.367411] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 818.367411] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] raise exceptions.translate_fault(task_info.error) [ 818.367411] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 818.367411] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Faults: ['InvalidArgument'] [ 818.367411] env[60788]: ERROR nova.compute.manager [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] [ 818.367411] env[60788]: DEBUG nova.compute.utils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 818.368971] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Build of instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c was re-scheduled: A specified parameter was not correct: fileType [ 818.368971] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 818.369124] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 818.369196] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 818.369352] env[60788]: DEBUG nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 818.369511] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 818.882427] env[60788]: DEBUG nova.network.neutron [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 818.898683] env[60788]: INFO nova.compute.manager [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Took 0.53 seconds to deallocate network for instance. [ 819.010904] env[60788]: INFO nova.scheduler.client.report [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Deleted allocations for instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c [ 819.030139] env[60788]: DEBUG oslo_concurrency.lockutils [None req-06a1fa91-582a-4ca7-965d-eeb874a1565c tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 294.500s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.031347] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 94.938s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 819.031569] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Acquiring lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 819.031772] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 819.031937] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.034573] env[60788]: INFO nova.compute.manager [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Terminating instance [ 819.037569] env[60788]: DEBUG nova.compute.manager [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 819.037650] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 819.038053] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ec55c263-6521-4830-a515-b51c8c7d979f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.049648] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ddb41b-1c0d-46b8-b245-992b8411a664 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.060706] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 6f437d9f-f904-46ab-9dc6-9902c2ea4c71] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.083246] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 231fcc6a-7ec4-4202-b960-ddc966ef2b9c could not be found. [ 819.083518] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 819.083731] env[60788]: INFO nova.compute.manager [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Took 0.05 seconds to destroy the instance on the hypervisor.
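The "Instance does not exist on backend" WARNING followed immediately by "Instance destroyed" above is the destroy path treating an already-missing backend VM as a successful teardown, so the terminate flow can keep going instead of failing. A minimal, self-contained sketch of that tolerate-missing pattern (stub names invented for illustration; not Nova's actual implementation):

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound (illustrative only)."""

def _find_vm_ref(instance_uuid):
    # Hypothetical backend lookup; the real driver issues
    # SearchIndex.FindAllByUuid against vCenter, as seen in the
    # oslo_vmware.service lines above. Here the VM is always gone.
    raise InstanceNotFound(instance_uuid)

def destroy(instance_uuid):
    try:
        vm_ref = _find_vm_ref(instance_uuid)
        # ... power-off, UnregisterVM and file deletion would use vm_ref here ...
    except InstanceNotFound:
        # An already-gone VM is not a failure for destroy(): warn and fall
        # through so terminate can still deallocate network and quota.
        print("WARNING: Instance does not exist on backend: %s" % instance_uuid)
    print("DEBUG: Instance destroyed")

destroy("231fcc6a-7ec4-4202-b960-ddc966ef2b9c")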
[ 819.083992] env[60788]: DEBUG oslo.service.loopingcall [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 819.084243] env[60788]: DEBUG nova.compute.manager [-] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 819.084344] env[60788]: DEBUG nova.network.neutron [-] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 819.087047] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 6f437d9f-f904-46ab-9dc6-9902c2ea4c71] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.120375] env[60788]: DEBUG nova.network.neutron [-] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 819.124448] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "6f437d9f-f904-46ab-9dc6-9902c2ea4c71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 241.753s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.127035] env[60788]: INFO nova.compute.manager [-] [instance: 231fcc6a-7ec4-4202-b960-ddc966ef2b9c] Took 0.04 seconds to deallocate network for instance. [ 819.135145] env[60788]: DEBUG nova.compute.manager [None req-db9e33ef-b90c-4212-88a2-d697728b61e6 tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] [instance: ef9990e1-e0a7-41c0-b738-de213fd7046a] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.158962] env[60788]: DEBUG nova.compute.manager [None req-db9e33ef-b90c-4212-88a2-d697728b61e6 tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] [instance: ef9990e1-e0a7-41c0-b738-de213fd7046a] Instance disappeared before build.
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.180460] env[60788]: DEBUG oslo_concurrency.lockutils [None req-db9e33ef-b90c-4212-88a2-d697728b61e6 tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] Lock "ef9990e1-e0a7-41c0-b738-de213fd7046a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.184s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.194423] env[60788]: DEBUG nova.compute.manager [None req-c9d3fce2-9ee6-4e64-9b70-e3d9c0afcea4 tempest-FloatingIPsAssociationTestJSON-421040393 tempest-FloatingIPsAssociationTestJSON-421040393-project-member] [instance: d159fbfa-f391-41f8-97ba-eb145eed26e7] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.229400] env[60788]: DEBUG nova.compute.manager [None req-c9d3fce2-9ee6-4e64-9b70-e3d9c0afcea4 tempest-FloatingIPsAssociationTestJSON-421040393 tempest-FloatingIPsAssociationTestJSON-421040393-project-member] [instance: d159fbfa-f391-41f8-97ba-eb145eed26e7] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.235270] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e3ba8beb-d75e-40fe-9132-9ba5871bf271 tempest-ServerDiagnosticsTest-2116693796 tempest-ServerDiagnosticsTest-2116693796-project-member] Lock "231fcc6a-7ec4-4202-b960-ddc966ef2b9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.204s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.254306] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c9d3fce2-9ee6-4e64-9b70-e3d9c0afcea4 tempest-FloatingIPsAssociationTestJSON-421040393 tempest-FloatingIPsAssociationTestJSON-421040393-project-member] Lock "d159fbfa-f391-41f8-97ba-eb145eed26e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 233.286s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.267597] env[60788]: DEBUG nova.compute.manager [None req-3d0c9844-ecd7-470c-91a3-7db11090c13a tempest-ServerGroupTestJSON-1121452212 tempest-ServerGroupTestJSON-1121452212-project-member] [instance: 064e7e7c-eeca-4822-9d5a-148b9fbdc1f0] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.294752] env[60788]: DEBUG nova.compute.manager [None req-3d0c9844-ecd7-470c-91a3-7db11090c13a tempest-ServerGroupTestJSON-1121452212 tempest-ServerGroupTestJSON-1121452212-project-member] [instance: 064e7e7c-eeca-4822-9d5a-148b9fbdc1f0] Instance disappeared before build.
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.318648] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3d0c9844-ecd7-470c-91a3-7db11090c13a tempest-ServerGroupTestJSON-1121452212 tempest-ServerGroupTestJSON-1121452212-project-member] Lock "064e7e7c-eeca-4822-9d5a-148b9fbdc1f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 233.350s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.331574] env[60788]: DEBUG nova.compute.manager [None req-8579c047-2ba7-460f-8a0e-e547b1b073fa tempest-AttachInterfacesV270Test-1164081489 tempest-AttachInterfacesV270Test-1164081489-project-member] [instance: 4e86b919-f5f8-458c-a588-bd08bdcccf3b] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.366909] env[60788]: DEBUG nova.compute.manager [None req-8579c047-2ba7-460f-8a0e-e547b1b073fa tempest-AttachInterfacesV270Test-1164081489 tempest-AttachInterfacesV270Test-1164081489-project-member] [instance: 4e86b919-f5f8-458c-a588-bd08bdcccf3b] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.419284] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8579c047-2ba7-460f-8a0e-e547b1b073fa tempest-AttachInterfacesV270Test-1164081489 tempest-AttachInterfacesV270Test-1164081489-project-member] Lock "4e86b919-f5f8-458c-a588-bd08bdcccf3b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 219.734s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.433868] env[60788]: DEBUG nova.compute.manager [None req-125c2602-d133-402f-a81c-ca494c37f0b9 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e991879c-de94-4e14-9480-95c95bcaaa05] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.466478] env[60788]: DEBUG nova.compute.manager [None req-125c2602-d133-402f-a81c-ca494c37f0b9 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e991879c-de94-4e14-9480-95c95bcaaa05] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 819.499917] env[60788]: DEBUG oslo_concurrency.lockutils [None req-125c2602-d133-402f-a81c-ca494c37f0b9 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e991879c-de94-4e14-9480-95c95bcaaa05" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.817s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 819.511301] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 819.572175] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 819.572175] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 819.573734] env[60788]: INFO nova.compute.claims [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 820.001045] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d767088d-570c-41db-a3fe-4fd5bbf5766a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.009213] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1753718-977a-4482-acaa-80459cc2c35c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.040772] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65546146-113d-4149-81b1-4ceabcd86ac5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.049141] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25aa2cd3-b643-4738-9d47-35243e69fa2a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.063070] env[60788]: DEBUG nova.compute.provider_tree [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 820.071830] env[60788]: DEBUG nova.scheduler.client.report [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 820.089895] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.518s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 820.090652] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 820.127117] env[60788]: DEBUG nova.compute.utils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 820.131057] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 820.131057] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 820.137673] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 820.210035] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 820.229951] env[60788]: DEBUG nova.policy [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea622ae4a8c4429a924b9bce6e7a4170', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7263977fa40b402b8773fad106a23783', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 820.240082] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 820.240351] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 820.240518] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 820.240710] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 820.240848] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 820.240994] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 820.241222] env[60788]: DEBUG nova.virt.hardware
[None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 820.241382] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 820.241550] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 820.241711] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 820.241879] env[60788]: DEBUG nova.virt.hardware [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 820.242796] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5072463-bbd3-4d7f-b12d-c30b654de59a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.251599] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d44462d7-26f0-409c-938f-a4220353fb0e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 820.783242] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Successfully created port: 8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 822.117205] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Successfully updated port: 8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 822.130434] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "refresh_cache-fe6168fd-528f-4acb-a44c-6d0b69cada6e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 822.130694] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb 
tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquired lock "refresh_cache-fe6168fd-528f-4acb-a44c-6d0b69cada6e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 822.130866] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 822.263708] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 822.533523] env[60788]: DEBUG nova.compute.manager [req-6c81b5f9-e446-4437-9ea7-f5a40d343e68 req-ea9093c5-9646-4921-a7c7-331eafbb3b41 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Received event network-vif-plugged-8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 822.533789] env[60788]: DEBUG oslo_concurrency.lockutils [req-6c81b5f9-e446-4437-9ea7-f5a40d343e68 req-ea9093c5-9646-4921-a7c7-331eafbb3b41 service nova] Acquiring lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.534091] env[60788]: DEBUG oslo_concurrency.lockutils [req-6c81b5f9-e446-4437-9ea7-f5a40d343e68 req-ea9093c5-9646-4921-a7c7-331eafbb3b41 service nova] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.534453] env[60788]: DEBUG oslo_concurrency.lockutils [req-6c81b5f9-e446-4437-9ea7-f5a40d343e68 req-ea9093c5-9646-4921-a7c7-331eafbb3b41 service nova] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 822.534660] env[60788]: DEBUG nova.compute.manager [req-6c81b5f9-e446-4437-9ea7-f5a40d343e68 req-ea9093c5-9646-4921-a7c7-331eafbb3b41 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] No waiting events found dispatching network-vif-plugged-8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 822.534891] env[60788]: WARNING nova.compute.manager [req-6c81b5f9-e446-4437-9ea7-f5a40d343e68 req-ea9093c5-9646-4921-a7c7-331eafbb3b41 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Received unexpected event network-vif-plugged-8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 for instance with vm_state building and task_state spawning.
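The "No waiting events found ... Received unexpected event" pair above comes from the compute manager popping its per-instance event registry when Neutron delivers network-vif-plugged: no waiter was registered yet (the spawn path had not asked to wait on the plug), so the event is logged as unexpected and dropped. A rough, self-contained sketch of that bookkeeping, loosely modeled on the idea behind nova.compute.manager.InstanceEvents (the names and structure here are illustrative, not Nova's real API surface):

import threading

class InstanceEvents:
    """Toy registry of waiters keyed by (instance_uuid, event_name)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # instance_uuid -> {event_name: threading.Event}

    def prepare(self, uuid, name):
        # A build/attach path calls this *before* triggering the change,
        # then blocks on the returned Event until the external event lands.
        with self._lock:
            ev = threading.Event()
            self._events.setdefault(uuid, {})[name] = ev
        return ev

    def pop(self, uuid, name):
        # Called when the external (Neutron) event arrives; returns the
        # waiter if one was registered, else None.
        with self._lock:
            return self._events.get(uuid, {}).pop(name, None)

registry = InstanceEvents()
waiter = registry.pop("fe6168fd-528f-4acb-a44c-6d0b69cada6e",
                      "network-vif-plugged-8df2cf68-201b-4b34-90aa-a5c34ac9bdf3")
if waiter is None:
    # Matches the WARNING above: nobody was waiting yet.
    print("No waiting events found; received unexpected event")
else:
    waiter.set()  # would unblock the thread that called prepare()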
[ 822.652272] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Updating instance_info_cache with network_info: [{"id": "8df2cf68-201b-4b34-90aa-a5c34ac9bdf3", "address": "fa:16:3e:78:64:38", "network": {"id": "151007b8-60b4-4697-8081-d738b16e3b5b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1100336604-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7263977fa40b402b8773fad106a23783", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b5291d0-ee0f-4d70-b2ae-ab6879a67b08", "external-id": "nsx-vlan-transportzone-597", "segmentation_id": 597, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8df2cf68-20", "ovs_interfaceid": "8df2cf68-201b-4b34-90aa-a5c34ac9bdf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 822.669562] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Releasing lock "refresh_cache-fe6168fd-528f-4acb-a44c-6d0b69cada6e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 822.669849] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance network_info: |[{"id": "8df2cf68-201b-4b34-90aa-a5c34ac9bdf3", "address": "fa:16:3e:78:64:38", "network": {"id": "151007b8-60b4-4697-8081-d738b16e3b5b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1100336604-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7263977fa40b402b8773fad106a23783", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b5291d0-ee0f-4d70-b2ae-ab6879a67b08", "external-id": "nsx-vlan-transportzone-597", "segmentation_id": 597, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8df2cf68-20", "ovs_interfaceid": "8df2cf68-201b-4b34-90aa-a5c34ac9bdf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 822.670353] env[60788]: DEBUG nova.virt.vmwareapi.vmops 
[None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:64:38', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b5291d0-ee0f-4d70-b2ae-ab6879a67b08', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8df2cf68-201b-4b34-90aa-a5c34ac9bdf3', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 822.683496] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Creating folder: Project (7263977fa40b402b8773fad106a23783). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 822.684045] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-51190a7d-3c9e-48fc-ab88-a31b087b242e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.698239] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Created folder: Project (7263977fa40b402b8773fad106a23783) in parent group-v449747. [ 822.698515] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Creating folder: Instances. Parent ref: group-v449795. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 822.698820] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e502be28-a521-4a6e-915b-7f590fdfa22c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.709553] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Created folder: Instances in parent group-v449795. [ 822.709795] env[60788]: DEBUG oslo.service.loopingcall [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 822.709985] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 822.710208] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-15486eef-b0f8-4d61-bb23-3e1bfc02c8f8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 822.732446] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 822.732446] env[60788]: value = "task-2205166" [ 822.732446] env[60788]: _type = "Task" [ 822.732446] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 822.740968] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205166, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 823.244172] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205166, 'name': CreateVM_Task, 'duration_secs': 0.340489} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 823.244507] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 823.244976] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 823.245165] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 823.245473] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 823.245727] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9c55f31c-3cf2-4c21-8195-5c4d324f17bd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 823.251152] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Waiting for the task: (returnval){ [ 823.251152] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52026e0f-55f4-dac2-7e70-68f2327d2633" [ 823.251152] env[60788]: _type = "Task" [ 823.251152] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 823.259443] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52026e0f-55f4-dac2-7e70-68f2327d2633, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 823.566053] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 823.566405] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 823.761422] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 823.761750] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 823.762042] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 824.552641] env[60788]: DEBUG nova.compute.manager [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Received event network-changed-8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 824.552977] env[60788]: DEBUG nova.compute.manager [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Refreshing instance network info cache due to event network-changed-8df2cf68-201b-4b34-90aa-a5c34ac9bdf3.
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 824.553136] env[60788]: DEBUG oslo_concurrency.lockutils [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] Acquiring lock "refresh_cache-fe6168fd-528f-4acb-a44c-6d0b69cada6e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 824.553215] env[60788]: DEBUG oslo_concurrency.lockutils [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] Acquired lock "refresh_cache-fe6168fd-528f-4acb-a44c-6d0b69cada6e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 824.553357] env[60788]: DEBUG nova.network.neutron [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Refreshing network info cache for port 8df2cf68-201b-4b34-90aa-a5c34ac9bdf3 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 824.988270] env[60788]: DEBUG nova.network.neutron [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Updated VIF entry in instance network info cache for port 8df2cf68-201b-4b34-90aa-a5c34ac9bdf3. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 824.988630] env[60788]: DEBUG nova.network.neutron [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Updating instance_info_cache with network_info: [{"id": "8df2cf68-201b-4b34-90aa-a5c34ac9bdf3", "address": "fa:16:3e:78:64:38", "network": {"id": "151007b8-60b4-4697-8081-d738b16e3b5b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1100336604-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7263977fa40b402b8773fad106a23783", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b5291d0-ee0f-4d70-b2ae-ab6879a67b08", "external-id": "nsx-vlan-transportzone-597", "segmentation_id": 597, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8df2cf68-20", "ovs_interfaceid": "8df2cf68-201b-4b34-90aa-a5c34ac9bdf3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 824.998317] env[60788]: DEBUG oslo_concurrency.lockutils [req-607cb94f-3223-440e-bad5-33e9b477df90 req-6749bde0-6c25-4c5a-a4bd-289a9af1b585 service nova] Releasing lock "refresh_cache-fe6168fd-528f-4acb-a44c-6d0b69cada6e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 830.754104] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 831.749561] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 831.753228] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 831.753391] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 832.750633] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 832.772716] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 832.773097] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 832.773335] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 832.795167] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795167] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795167] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795167] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795167] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795413] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795413] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795638] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.795893] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.796226] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 832.796466] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 832.797013] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 832.807390] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 832.807824] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 832.808135] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 832.808413] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 832.810011] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccd19fdf-43e8-42f8-af63-57ebba2cee6f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.818714] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c6f9e86-4688-472e-ae85-d75b6a4c0f2e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.835259] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08633e8c-e1f8-4de2-bb99-44ee92548543 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.842013] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f9d50bc-0fcf-45a8-934b-a7ecda84339f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 832.873837] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181183MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 832.873983] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 832.874201] 
env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 832.956268] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb58b00e-1a78-4750-b912-48c94144ea66 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.956437] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 331fc548-2076-48e2-a84b-94130a99c2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.956793] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.956793] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.956793] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.956940] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.957038] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.957161] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.957277] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.957392] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 832.969314] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 832.980572] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 832.991061] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6a101160-4c4a-42a5-9dfa-e7f41aa9788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.001318] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 57c974dd-093c-44c1-ab08-1659e25bb392 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.015261] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d5972768-f55f-495b-a49f-43b00c4647c2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.025251] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 911b94d1-8c01-49fa-ae13-4565a028676e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.037221] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.043800] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d136a94d-344a-4697-97b5-3d732a16f4a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.054268] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1ae1eb4b-4696-4592-a758-79b2211d35c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.063693] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4bfbddb3-f66c-4059-8624-654e180ab997 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.073914] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5ebb3604-792d-4fd7-95e6-d8a826c2d50a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.083328] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b7b0591b-123b-49ad-8ab6-6881d0c7888b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.093092] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 71ac0cb5-ebac-4f22-897c-1742b5416fca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.102661] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 55ccb77a-7c54-4e4f-a665-43dc1c30e595 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.111721] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 71acb134-101c-482b-9e5f-bbc18b8e01d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.122009] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 9da7df70-e116-4bf9-83fb-626208162b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.132177] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f39ca342-04ed-45d7-8017-717d3a9ba244 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.162703] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b4004d4f-8a7f-42be-9ce4-5ab53ae62f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.173240] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 822c4411-1759-4d9e-820a-5d617fdd2488 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.183991] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 61d9ecf8-0ed5-4451-9953-e53cabecf36b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.193380] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 833.193621] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 833.193770] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 833.525247] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31ab4c96-cd2a-4211-bad6-c4e02e3ad38c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.533188] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e90dded-e0e2-4108-9c5d-94208025ee59 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.564092] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d028592-4700-40c7-b324-abf255a2cda2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.572376] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d43678c-a1f0-42a4-8a8a-3cf0d998517f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 833.586206] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 833.594476] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 833.608305] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 833.608498] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.734s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 834.565147] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 834.753345] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 834.753573] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 835.754287] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 839.644503] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 867.268445] env[60788]: WARNING oslo_vmware.rw_handles [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 867.268445] env[60788]:
ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 867.268445] env[60788]: ERROR oslo_vmware.rw_handles [ 867.269032] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 867.271476] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 867.271796] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Copying Virtual Disk [datastore2] vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/316affcd-62ed-400f-acd8-6cf19081ac01/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 867.272165] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b40d981d-8357-4b47-9aca-a3c9faadf565 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.280943] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Waiting for the task: (returnval){ [ 867.280943] env[60788]: value = "task-2205167" [ 867.280943] env[60788]: _type = "Task" [ 867.280943] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.289836] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Task: {'id': task-2205167, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 867.791404] env[60788]: DEBUG oslo_vmware.exceptions [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 867.791758] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 867.792318] env[60788]: ERROR nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 867.792318] env[60788]: Faults: ['InvalidArgument'] [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Traceback (most recent call last): [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] yield resources [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self.driver.spawn(context, instance, image_meta, [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self._fetch_image_if_missing(context, vi) [ 867.792318] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] image_cache(vi, tmp_image_ds_loc) [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] vm_util.copy_virtual_disk( [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] session._wait_for_task(vmdk_copy_task) [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] return self.wait_for_task(task_ref) [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] return evt.wait() [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] result = hub.switch() [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 867.792762] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] return self.greenlet.switch() [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self.f(*self.args, **self.kw) [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] raise exceptions.translate_fault(task_info.error) [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Faults: ['InvalidArgument'] [ 867.793158] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] [ 867.793158] env[60788]: INFO nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Terminating instance [ 867.794234] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 867.794437] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 867.794689] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-60595f24-9cb2-44bc-9d8a-aa88f3f143fa {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.797088] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 867.797393] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 867.799177] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-787901e5-da06-48a6-86dd-63b42596e9db {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.804739] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 867.805009] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89550910-b519-4d32-8a2c-16d08c8ceb2c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.807283] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 867.807457] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 867.808438] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a3237f3d-bddb-4364-8a51-4916b3b86b4c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.813406] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Waiting for the task: (returnval){ [ 867.813406] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]527258ab-af97-b0c5-f1da-0be712ca4ed2" [ 867.813406] env[60788]: _type = "Task" [ 867.813406] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.820454] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]527258ab-af97-b0c5-f1da-0be712ca4ed2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 867.877688] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 867.877915] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 867.878106] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Deleting the datastore file [datastore2] fb58b00e-1a78-4750-b912-48c94144ea66 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 867.878389] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc352d58-de14-4f89-844a-0eba7be72ebf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 867.884717] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Waiting for the task: (returnval){ [ 867.884717] env[60788]: value = "task-2205169" [ 867.884717] env[60788]: _type = "Task" [ 867.884717] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 867.892413] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Task: {'id': task-2205169, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 868.323574] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 868.323850] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Creating directory with path [datastore2] vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 868.324095] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d5556b3b-fc84-476e-ab0f-919d33aac5b2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.335399] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Created directory with path [datastore2] vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 868.335583] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Fetch image to [datastore2] vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 868.335749] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 868.336655] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21bd213b-10ab-4f12-b629-e56dd9fd4d16 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.343473] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0059e66e-c87d-4479-811e-69f84a079e46 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.352900] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60cc9e47-86fe-4dfe-a412-df6cc5804e31 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.385031] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2b5ba59a-803a-4a10-8d74-97e6ae9a6cc6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.395876] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7e16a99a-8207-4a36-b0be-b32910d89bda {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.397415] env[60788]: DEBUG oslo_vmware.api [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Task: {'id': task-2205169, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073334} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 868.397650] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 868.397831] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 868.397997] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 868.398180] env[60788]: INFO nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Took 0.60 seconds to destroy the instance on the hypervisor. 
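
The task-2205167 and task-2205169 records above both follow the same submit-then-poll shape: a long-running vSphere operation (CopyVirtualDisk_Task, DeleteDatastoreFile_Task) hands back a task object, and wait_for_task blocks while _poll_task periodically re-reads the task state until it reports success ('duration_secs': 0.073334) or a fault (the InvalidArgument seen earlier). The Python below is a minimal sketch of that loop, not the oslo.vmware implementation; TaskInfo and fetch_task_info are hypothetical stand-ins for the SOAP task-info read.

    import time
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class TaskInfo:
        state: str                         # 'running', 'success' or 'error'
        progress: int = 0                  # the "progress is 0%" records
        error: Optional[Exception] = None  # populated when state == 'error'

    def wait_for_task(task_id: str,
                      fetch_task_info: Callable[[str], TaskInfo],
                      interval: float = 0.5) -> TaskInfo:
        """Block until the task reaches a terminal state, then return or raise.

        fetch_task_info stands in for re-reading the task's info property,
        which the log performs via PropertyCollector.RetrievePropertiesEx.
        """
        while True:
            info = fetch_task_info(task_id)
            if info.state == "success":
                return info
            if info.state == "error":
                # Surfaces as a translated fault in the log, e.g.
                # VimFaultException: A specified parameter was not correct: fileType
                raise info.error
            time.sleep(interval)  # the poll tick between progress reads
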
[ 868.400413] env[60788]: DEBUG nova.compute.claims [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 868.400585] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 868.400816] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 868.432583] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 868.491184] env[60788]: DEBUG oslo_vmware.rw_handles [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 868.552214] env[60788]: DEBUG oslo_vmware.rw_handles [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 868.552490] env[60788]: DEBUG oslo_vmware.rw_handles [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 868.866882] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea542152-2b19-4307-b26c-556f728ce673 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.874638] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-864358b9-d3fb-4dba-9816-24e4aef446a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.906841] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a91c1db2-05a8-4d40-8c96-f712dbd67633 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.914282] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdaea9cc-edc5-4d6f-ae49-0cd3c044dc3d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 868.928514] env[60788]: DEBUG nova.compute.provider_tree [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 868.938451] env[60788]: DEBUG nova.scheduler.client.report [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 868.956254] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.555s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 868.956802] env[60788]: ERROR nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 868.956802] env[60788]: Faults: ['InvalidArgument'] [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Traceback (most recent call last): [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: 
fb58b00e-1a78-4750-b912-48c94144ea66] self.driver.spawn(context, instance, image_meta, [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self._vmops.spawn(context, instance, image_meta, injected_files, [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self._fetch_image_if_missing(context, vi) [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] image_cache(vi, tmp_image_ds_loc) [ 868.956802] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] vm_util.copy_virtual_disk( [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] session._wait_for_task(vmdk_copy_task) [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] return self.wait_for_task(task_ref) [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] return evt.wait() [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] result = hub.switch() [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] return self.greenlet.switch() [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 868.957230] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] self.f(*self.args, **self.kw) [ 868.957643] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 868.957643] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] raise exceptions.translate_fault(task_info.error) [ 868.957643] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 868.957643] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Faults: ['InvalidArgument'] [ 868.957643] env[60788]: ERROR nova.compute.manager [instance: fb58b00e-1a78-4750-b912-48c94144ea66] [ 868.957643] env[60788]: DEBUG nova.compute.utils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 868.959385] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Build of instance fb58b00e-1a78-4750-b912-48c94144ea66 was re-scheduled: A specified parameter was not correct: fileType [ 868.959385] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 868.959766] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 868.959940] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 868.960128] env[60788]: DEBUG nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 868.960326] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 869.454329] env[60788]: DEBUG nova.network.neutron [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 869.468045] env[60788]: INFO nova.compute.manager [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Took 0.51 seconds to deallocate network for instance. [ 869.592725] env[60788]: INFO nova.scheduler.client.report [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Deleted allocations for instance fb58b00e-1a78-4750-b912-48c94144ea66 [ 869.611012] env[60788]: DEBUG oslo_concurrency.lockutils [None req-18dfda95-4e13-419d-b3f5-96cc751f242e tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "fb58b00e-1a78-4750-b912-48c94144ea66" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 343.441s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.612109] env[60788]: DEBUG oslo_concurrency.lockutils [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "fb58b00e-1a78-4750-b912-48c94144ea66" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 146.134s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 869.612345] env[60788]: DEBUG oslo_concurrency.lockutils [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Acquiring lock "fb58b00e-1a78-4750-b912-48c94144ea66-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 869.612552] env[60788]: DEBUG oslo_concurrency.lockutils [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "fb58b00e-1a78-4750-b912-48c94144ea66-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 869.612725] env[60788]: DEBUG oslo_concurrency.lockutils [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "fb58b00e-1a78-4750-b912-48c94144ea66-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 869.614680] env[60788]: INFO nova.compute.manager [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Terminating instance [ 869.616532] env[60788]: DEBUG nova.compute.manager [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 869.616921] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 869.617196] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2cbdbac1-5416-40ab-9090-275717aaeb20 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.623475] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 869.630120] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87b9f3ee-b24c-4a41-a82d-b1899c492f56 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 869.659057] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fb58b00e-1a78-4750-b912-48c94144ea66 could not be found. [ 869.659057] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 869.659250] env[60788]: INFO nova.compute.manager [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Took 0.04 seconds to destroy the instance on the hypervisor.
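
The second teardown above finishes in 0.04 seconds because there is nothing left to remove: the reschedule cleanup had already unregistered the VM and deleted its datastore files, so the FindAllByUuid lookup finds no VM, the driver downgrades InstanceNotFound to a warning, and the destroy is reported as complete. A minimal sketch of that idempotent-destroy shape, assuming hypothetical lookup_vm_ref and teardown callables rather than the vmwareapi driver's real interface:

    class InstanceNotFound(Exception):
        """The hypervisor backend has no VM for the given instance UUID."""

    def destroy(instance_uuid, lookup_vm_ref, teardown):
        """Destroy the backend VM, treating an already-missing VM as success."""
        try:
            vm_ref = lookup_vm_ref(instance_uuid)  # e.g. a by-UUID search
        except InstanceNotFound:
            # Mirrors the WARNING above: the VM is already gone, so the
            # destroy counts as complete rather than failed.
            return
        teardown(vm_ref)  # unregister the VM and delete its datastore files

Making destroy tolerant of a missing backend VM is what lets terminate run safely after a failed or cleaned-up build, as the per-instance lock handoff above (held 343.441s, waited 146.134s) forces the two paths to run back to back.
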
[ 869.659485] env[60788]: DEBUG oslo.service.loopingcall [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 869.661653] env[60788]: DEBUG nova.compute.manager [-] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 869.661768] env[60788]: DEBUG nova.network.neutron [-] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 869.676245] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 869.676491] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 869.677981] env[60788]: INFO nova.compute.claims [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 869.725768] env[60788]: DEBUG nova.network.neutron [-] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 869.736540] env[60788]: INFO nova.compute.manager [-] [instance: fb58b00e-1a78-4750-b912-48c94144ea66] Took 0.07 seconds to deallocate network for instance.
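
The oslo.service.loopingcall line above ("Waiting for function ... _deallocate_network_with_retries to return") comes from a retry wrapper: network deallocation is re-invoked until it stops raising. A minimal sketch of the same oslo.service primitive, assuming the library is installed; the wrapped function merely mirrors the name in the log and its failure is a placeholder, not Nova's code:

from oslo_service import loopingcall

_attempts = {'n': 0}

@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                            max_sleep_time=5, exceptions=(RuntimeError,))
def _deallocate_network_with_retries():
    # Fails once, then succeeds; RetryDecorator re-invokes the function
    # when it raises one of the listed exceptions, sleeping in between.
    _attempts['n'] += 1
    if _attempts['n'] < 2:
        raise RuntimeError('transient neutron failure')
    return 'deallocated'

print(_deallocate_network_with_retries())  # returns on the second attempt
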
[ 869.834763] env[60788]: DEBUG oslo_concurrency.lockutils [None req-bbf58ccf-54b4-4cc3-ba3f-2c3ca0ea7600 tempest-TenantUsagesTestJSON-781894378 tempest-TenantUsagesTestJSON-781894378-project-member] Lock "fb58b00e-1a78-4750-b912-48c94144ea66" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.223s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 870.101718] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3995bfe1-0838-47d1-b3b0-ad408287dc32 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.110207] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c247fba4-94a0-425b-b950-bf46413a2688 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.140664] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c686a1db-955b-437b-8719-74b1cd1f4074 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.149858] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-550a902b-8ac0-4207-ab4d-720a7fe1d8a8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.162068] env[60788]: DEBUG nova.compute.provider_tree [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 870.170146] env[60788]: DEBUG nova.scheduler.client.report [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 870.184607] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.508s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 870.185111] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Start building networks asynchronously for instance.
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 870.221670] env[60788]: DEBUG nova.compute.utils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 870.223963] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 870.224182] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 870.232241] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 870.301124] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 870.326324] env[60788]: DEBUG nova.policy [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0dfa112a7b6f4c8881f0e5a3fabcad4d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2980fd75fce447f7b2e2109ca1b3b900', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 870.335051] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 870.335291] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 870.335448] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 870.335626] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 870.335771] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 870.335938] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 870.336182] env[60788]: DEBUG
nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 870.336346] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 870.336580] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 870.336774] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 870.337327] env[60788]: DEBUG nova.virt.hardware [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 870.337990] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7d659e8-da39-4c9b-8e3d-23864d9ebaeb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.346950] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fd84034-93ba-454f-b8bf-c8b9f532c226 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 870.905429] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Successfully created port: 86b4af83-e307-40e5-b5e9-d09d5d0c5765 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 872.182936] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Successfully updated port: 86b4af83-e307-40e5-b5e9-d09d5d0c5765 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 872.203982] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "refresh_cache-af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 872.204138] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 
tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquired lock "refresh_cache-af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 872.204290] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 872.283678] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 872.388023] env[60788]: DEBUG nova.compute.manager [req-32196fbb-61b2-41c8-b290-0a7bdd6778f4 req-f88a13e3-41ab-4d0a-a146-dabda0ac5b8a service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Received event network-vif-plugged-86b4af83-e307-40e5-b5e9-d09d5d0c5765 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 872.389270] env[60788]: DEBUG oslo_concurrency.lockutils [req-32196fbb-61b2-41c8-b290-0a7bdd6778f4 req-f88a13e3-41ab-4d0a-a146-dabda0ac5b8a service nova] Acquiring lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 872.389490] env[60788]: DEBUG oslo_concurrency.lockutils [req-32196fbb-61b2-41c8-b290-0a7bdd6778f4 req-f88a13e3-41ab-4d0a-a146-dabda0ac5b8a service nova] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 872.389692] env[60788]: DEBUG oslo_concurrency.lockutils [req-32196fbb-61b2-41c8-b290-0a7bdd6778f4 req-f88a13e3-41ab-4d0a-a146-dabda0ac5b8a service nova] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 872.389828] env[60788]: DEBUG nova.compute.manager [req-32196fbb-61b2-41c8-b290-0a7bdd6778f4 req-f88a13e3-41ab-4d0a-a146-dabda0ac5b8a service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] No waiting events found dispatching network-vif-plugged-86b4af83-e307-40e5-b5e9-d09d5d0c5765 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 872.390020] env[60788]: WARNING nova.compute.manager [req-32196fbb-61b2-41c8-b290-0a7bdd6778f4 req-f88a13e3-41ab-4d0a-a146-dabda0ac5b8a service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Received unexpected event network-vif-plugged-86b4af83-e307-40e5-b5e9-d09d5d0c5765 for instance with vm_state building and task_state spawning.
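
The pop_instance_event sequence above is a waiter registry at work: a thread that expects an external event (here Neutron's network-vif-plugged notification) registers interest before blocking, and an incoming event either wakes the registered waiter or, as in this case, is logged as unexpected because the spawn had not started waiting yet. A minimal sketch of that dispatch pattern with illustrative names (not Nova's implementation):

import threading

_waiters = {}          # event name -> threading.Event
_waiters_lock = threading.Lock()

def prepare_for_event(name):
    """Register interest in an external event before triggering it."""
    with _waiters_lock:
        return _waiters.setdefault(name, threading.Event())

def dispatch_event(name):
    """Deliver an external event; warn if no one was waiting for it."""
    with _waiters_lock:
        event = _waiters.pop(name, None)
    if event is None:
        print(f"WARNING: received unexpected event {name}")
    else:
        event.set()

# A spawning thread would call prepare_for_event() first and then block on
# the returned Event; here nothing is registered, so the dispatch warns.
dispatch_event('network-vif-plugged-86b4af83')
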
[ 872.578569] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Updating instance_info_cache with network_info: [{"id": "86b4af83-e307-40e5-b5e9-d09d5d0c5765", "address": "fa:16:3e:49:7e:17", "network": {"id": "96e4792f-eddd-4214-9eca-d422a947ebf7", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-463497009-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2980fd75fce447f7b2e2109ca1b3b900", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1430a695-49fb-4905-bc38-db9b869a1a9d", "external-id": "nsx-vlan-transportzone-297", "segmentation_id": 297, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86b4af83-e3", "ovs_interfaceid": "86b4af83-e307-40e5-b5e9-d09d5d0c5765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 872.596487] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Releasing lock "refresh_cache-af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 872.597023] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance network_info: |[{"id": "86b4af83-e307-40e5-b5e9-d09d5d0c5765", "address": "fa:16:3e:49:7e:17", "network": {"id": "96e4792f-eddd-4214-9eca-d422a947ebf7", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-463497009-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2980fd75fce447f7b2e2109ca1b3b900", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1430a695-49fb-4905-bc38-db9b869a1a9d", "external-id": "nsx-vlan-transportzone-297", "segmentation_id": 297, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86b4af83-e3", "ovs_interfaceid": "86b4af83-e307-40e5-b5e9-d09d5d0c5765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 872.597216] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:49:7e:17', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1430a695-49fb-4905-bc38-db9b869a1a9d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '86b4af83-e307-40e5-b5e9-d09d5d0c5765', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 872.608310] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Creating folder: Project (2980fd75fce447f7b2e2109ca1b3b900). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 872.608310] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c5676989-c141-4671-9517-7efdd765c110 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.619650] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Created folder: Project (2980fd75fce447f7b2e2109ca1b3b900) in parent group-v449747. [ 872.620201] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Creating folder: Instances. Parent ref: group-v449798. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 872.620201] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0e69ce3c-e661-42e7-b7e9-3c00a6fce22b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.628380] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Created folder: Instances in parent group-v449798. [ 872.629238] env[60788]: DEBUG oslo.service.loopingcall [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 872.629715] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 872.632855] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fa55a10b-6a35-4b33-8e08-45e23e9a2407 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 872.651504] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 872.651504] env[60788]: value = "task-2205172" [ 872.651504] env[60788]: _type = "Task" [ 872.651504] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 872.661282] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205172, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 873.161679] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205172, 'name': CreateVM_Task, 'duration_secs': 0.313065} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 873.161869] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 873.162882] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 873.163079] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 873.163520] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 873.163785] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-096d1950-5839-4540-ab6d-d2a7bd3eddea {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 873.168477] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Waiting for the task: (returnval){ [ 873.168477] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52c40ee8-9c76-a8ca-2f0e-496cb1b42cf2" [ 873.168477] env[60788]: _type = "Task" [ 873.168477] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 873.176328] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52c40ee8-9c76-a8ca-2f0e-496cb1b42cf2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 873.682532] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 873.684591] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 873.685105] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 874.652747] env[60788]: DEBUG nova.compute.manager [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Received event network-changed-86b4af83-e307-40e5-b5e9-d09d5d0c5765 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 874.652747] env[60788]: DEBUG nova.compute.manager [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Refreshing instance network info cache due to event network-changed-86b4af83-e307-40e5-b5e9-d09d5d0c5765. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 874.652747] env[60788]: DEBUG oslo_concurrency.lockutils [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] Acquiring lock "refresh_cache-af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 874.653393] env[60788]: DEBUG oslo_concurrency.lockutils [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] Acquired lock "refresh_cache-af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 874.653486] env[60788]: DEBUG nova.network.neutron [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Refreshing network info cache for port 86b4af83-e307-40e5-b5e9-d09d5d0c5765 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 874.896204] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "e5084b03-325e-40db-9ffc-0467d53adf38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 874.896538] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "e5084b03-325e-40db-9ffc-0467d53adf38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 875.390487] env[60788]: DEBUG nova.network.neutron [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Updated VIF entry in instance network info cache for port 86b4af83-e307-40e5-b5e9-d09d5d0c5765.
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 875.390909] env[60788]: DEBUG nova.network.neutron [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Updating instance_info_cache with network_info: [{"id": "86b4af83-e307-40e5-b5e9-d09d5d0c5765", "address": "fa:16:3e:49:7e:17", "network": {"id": "96e4792f-eddd-4214-9eca-d422a947ebf7", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-463497009-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2980fd75fce447f7b2e2109ca1b3b900", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1430a695-49fb-4905-bc38-db9b869a1a9d", "external-id": "nsx-vlan-transportzone-297", "segmentation_id": 297, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap86b4af83-e3", "ovs_interfaceid": "86b4af83-e307-40e5-b5e9-d09d5d0c5765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 875.401015] env[60788]: DEBUG oslo_concurrency.lockutils [req-87a47faf-3115-4f0d-928e-87f82a81bbc1 req-56a81f7a-17a9-41c9-b4e6-36fb6c9315a3 service nova] Releasing lock "refresh_cache-af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 879.596831] env[60788]: DEBUG oslo_concurrency.lockutils [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 890.754986] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 892.748734] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 892.754055] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 892.754055] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 892.754055] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 892.766627] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 892.766893] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 892.767102] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 892.767306] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 892.768442] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cd1ccff-b5ff-48e7-b267-92a3b9da20ab {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 892.777454] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-374e8f73-c3e9-4fb2-820b-98953a3e07e8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 892.791479] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6236d088-5b08-4b7b-9328-2cd0b6dfa54d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 892.799238] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-995fa3f5-429e-4fa0-95d0-22d6773e6e9c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 892.829860] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181179MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 892.830038] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 892.830222] 
env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 892.902542] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 331fc548-2076-48e2-a84b-94130a99c2ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.902640] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.902732] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.902904] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.902989] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.903357] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.903357] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.903357] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.903621] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.903621] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 892.915960] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.926363] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6a101160-4c4a-42a5-9dfa-e7f41aa9788a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.936597] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 57c974dd-093c-44c1-ab08-1659e25bb392 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.946581] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d5972768-f55f-495b-a49f-43b00c4647c2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.957584] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 911b94d1-8c01-49fa-ae13-4565a028676e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.970112] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.979878] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d136a94d-344a-4697-97b5-3d732a16f4a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.989599] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1ae1eb4b-4696-4592-a758-79b2211d35c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 892.999162] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4bfbddb3-f66c-4059-8624-654e180ab997 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.009756] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5ebb3604-792d-4fd7-95e6-d8a826c2d50a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.020662] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b7b0591b-123b-49ad-8ab6-6881d0c7888b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.030342] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 71ac0cb5-ebac-4f22-897c-1742b5416fca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.040212] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 55ccb77a-7c54-4e4f-a665-43dc1c30e595 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.051098] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 71acb134-101c-482b-9e5f-bbc18b8e01d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.061439] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 9da7df70-e116-4bf9-83fb-626208162b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.074602] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f39ca342-04ed-45d7-8017-717d3a9ba244 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.084942] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b4004d4f-8a7f-42be-9ce4-5ab53ae62f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.095056] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 822c4411-1759-4d9e-820a-5d617fdd2488 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.104967] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 61d9ecf8-0ed5-4451-9953-e53cabecf36b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.114479] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.123765] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 893.124015] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 893.124183] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 893.466613] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f535a77b-9273-4fe5-a266-e6a677400e59 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 893.474304] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63dc618a-9d95-4e0f-9e71-1a2da324dd24 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 893.504503] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-557dd156-d71f-48a6-83ad-96d665e09503 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 893.511438] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c7a9ad3-513d-44e8-b9d7-b753b6d80207 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 893.524268] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 893.533201] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 893.550368] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 893.550521] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.720s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 894.551490] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 894.551814] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 894.551814] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 894.571194] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.571349] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.571481] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.571606] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.571742] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.571864] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.571982] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.572142] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.572274] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.572395] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 894.572516] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 894.572962] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 895.753542] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 895.753853] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 897.753737] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 914.529033] env[60788]: WARNING oslo_vmware.rw_handles [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 
1375, in getresponse [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 914.529033] env[60788]: ERROR oslo_vmware.rw_handles [ 914.529033] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 914.531350] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 914.531660] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Copying Virtual Disk [datastore2] vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/d8ed67aa-c2e2-44d1-8869-d85e37a1ed50/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 914.532023] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-64030b69-6081-4498-9c92-e3b5c296e3f5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.540458] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Waiting for the task: (returnval){ [ 914.540458] env[60788]: value = "task-2205173" [ 914.540458] env[60788]: _type = "Task" [ 914.540458] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 914.549511] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Task: {'id': task-2205173, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 915.052036] env[60788]: DEBUG oslo_vmware.exceptions [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 915.052036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 915.052036] env[60788]: ERROR nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 915.052036] env[60788]: Faults: ['InvalidArgument'] [ 915.052036] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Traceback (most recent call last): [ 915.052036] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 915.052036] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] yield resources [ 915.052036] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self.driver.spawn(context, instance, image_meta, [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self._fetch_image_if_missing(context, vi) [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] image_cache(vi, tmp_image_ds_loc) [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] vm_util.copy_virtual_disk( [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] session._wait_for_task(vmdk_copy_task) [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 915.052352] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] return self.wait_for_task(task_ref) [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] return evt.wait() [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] result = hub.switch() [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] return self.greenlet.switch() [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self.f(*self.args, **self.kw) [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] raise exceptions.translate_fault(task_info.error) [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Faults: ['InvalidArgument'] [ 915.052616] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] [ 915.052922] env[60788]: INFO nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Terminating instance [ 915.053330] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 915.053540] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 
tempest-ServersAdminTestJSON-756204499-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 915.053773] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9b7e119a-f067-4855-8216-bddaeb58e0f1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.056012] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 915.056225] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 915.056938] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f957f6e5-e4ab-4763-a2b6-76d55a359d24 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.064737] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 915.064976] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f8bb6ea9-b1e4-4412-a0fc-666e5bd6d1fb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.067244] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 915.067416] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 915.068385] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c6026f9-c83a-456f-8a0d-4cd5c8dc659b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.073122] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 915.073122] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]528ae410-5831-6743-e95e-3d0d74ff8abc" [ 915.073122] env[60788]: _type = "Task" [ 915.073122] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 915.080852] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]528ae410-5831-6743-e95e-3d0d74ff8abc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 915.136713] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 915.136945] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 915.137220] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Deleting the datastore file [datastore2] 331fc548-2076-48e2-a84b-94130a99c2ca {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 915.137501] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6ae7903e-c3ea-44cf-ba70-d41cf49fc321 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.143617] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Waiting for the task: (returnval){ [ 915.143617] env[60788]: value = "task-2205175" [ 915.143617] env[60788]: _type = "Task" [ 915.143617] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 915.151152] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Task: {'id': task-2205175, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 915.583346] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 915.583630] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating directory with path [datastore2] vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 915.583841] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-932f37c9-f239-4ae7-b3df-7c887e6bb26e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.595489] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Created directory with path [datastore2] vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 915.595690] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Fetch image to [datastore2] vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 915.595892] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 915.596651] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fce2dae3-8a2b-4ce8-b52f-09c08ab41ee1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.603065] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27f66906-ab9c-4d91-b0fc-6fac92434613 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.611968] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac6b7929-bdaf-448c-9550-e5145db1b018 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.642702] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-953d691e-c253-49e1-a167-a8b79bf551f7 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.654036] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-72a568c3-2ad0-4a46-8860-0dedba0ffbe9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 915.655741] env[60788]: DEBUG oslo_vmware.api [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Task: {'id': task-2205175, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07839} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 915.656009] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 915.656205] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 915.656374] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 915.656545] env[60788]: INFO nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Took 0.60 seconds to destroy the instance on the hypervisor. 
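
The DeleteDatastoreFile_Task sequence above (invoked at 915.137501, polled at 915.151152, "completed successfully" at 915.655741) is the standard oslo.vmware invoke-and-poll pattern. Below is a minimal sketch of that pattern, assuming a reachable vCenter; the host, credentials, and intervals are placeholders, not values from this log, and the Datacenter moref argument is omitted for brevity.

    # Sketch of the oslo.vmware invoke-and-poll pattern; placeholders only.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10,       # retries for transient API failures
        task_poll_interval=0.5)   # seconds between _poll_task calls

    # Invoke FileManager.DeleteDatastoreFile_Task, as at 915.137501 above.
    # A real caller also passes the Datacenter moref as datacenter=...
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task',
        session.vim.service_content.fileManager,
        name='[datastore2] 331fc548-2076-48e2-a84b-94130a99c2ca')

    # Blocks, logging "Task: {...} progress is N%" while polling, and raises
    # a translated oslo_vmware.exceptions fault if the task errors out.
    session.wait_for_task(task)
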
[ 915.658972] env[60788]: DEBUG nova.compute.claims [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 915.659097] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 915.659276] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 915.676562] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 915.730469] env[60788]: DEBUG oslo_vmware.rw_handles [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 915.791373] env[60788]: DEBUG oslo_vmware.rw_handles [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 915.791577] env[60788]: DEBUG oslo_vmware.rw_handles [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 916.113484] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac075ff-e8d8-42b5-bf61-d861c0cde995 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.122107] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75664ec9-86cc-4249-abb3-b2acfd56e463 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.152886] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe2b3c72-b27f-4133-bce3-d56f31345ea5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.159861] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3bbeb72-8a6b-4e81-8ae2-8119d831a519 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 916.172771] env[60788]: DEBUG nova.compute.provider_tree [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 916.182993] env[60788]: DEBUG nova.scheduler.client.report [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 916.196443] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.537s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 916.196917] env[60788]: ERROR nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 916.196917] env[60788]: Faults: ['InvalidArgument'] [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Traceback (most recent call last): [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 
916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self.driver.spawn(context, instance, image_meta, [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self._fetch_image_if_missing(context, vi) [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] image_cache(vi, tmp_image_ds_loc) [ 916.196917] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] vm_util.copy_virtual_disk( [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] session._wait_for_task(vmdk_copy_task) [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] return self.wait_for_task(task_ref) [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] return evt.wait() [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] result = hub.switch() [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] return self.greenlet.switch() [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 916.197319] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] self.f(*self.args, **self.kw) [ 916.197645] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 916.197645] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] raise exceptions.translate_fault(task_info.error) [ 916.197645] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 916.197645] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Faults: ['InvalidArgument'] [ 916.197645] env[60788]: ERROR nova.compute.manager [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] [ 916.197645] env[60788]: DEBUG nova.compute.utils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 916.199101] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Build of instance 331fc548-2076-48e2-a84b-94130a99c2ca was re-scheduled: A specified parameter was not correct: fileType [ 916.199101] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 916.199542] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 916.199717] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 916.199871] env[60788]: DEBUG nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 916.200072] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 916.908780] env[60788]: DEBUG nova.network.neutron [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 916.926133] env[60788]: INFO nova.compute.manager [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Took 0.73 seconds to deallocate network for instance. [ 917.030058] env[60788]: INFO nova.scheduler.client.report [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Deleted allocations for instance 331fc548-2076-48e2-a84b-94130a99c2ca [ 917.050018] env[60788]: DEBUG oslo_concurrency.lockutils [None req-37c51232-f52a-4bfd-b9f1-bdf6091d33c6 tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "331fc548-2076-48e2-a84b-94130a99c2ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 384.679s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.050018] env[60788]: DEBUG oslo_concurrency.lockutils [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "331fc548-2076-48e2-a84b-94130a99c2ca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 184.793s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 917.050018] env[60788]: DEBUG oslo_concurrency.lockutils [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Acquiring lock "331fc548-2076-48e2-a84b-94130a99c2ca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 917.050230] env[60788]: DEBUG oslo_concurrency.lockutils [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "331fc548-2076-48e2-a84b-94130a99c2ca-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 917.050230] env[60788]: DEBUG oslo_concurrency.lockutils [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "331fc548-2076-48e2-a84b-94130a99c2ca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.051393] env[60788]: INFO nova.compute.manager [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Terminating instance [ 917.053865] env[60788]: DEBUG nova.compute.manager [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 917.054386] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 917.055431] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a3856f98-530b-4c52-8b39-9d79f37938fa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.066542] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f63db1-9ebd-4bdc-8f8f-5302e573f657 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.079030] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 917.099465] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 331fc548-2076-48e2-a84b-94130a99c2ca could not be found.
[ 917.099841] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 917.100227] env[60788]: INFO nova.compute.manager [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Took 0.05 seconds to destroy the instance on the hypervisor. [ 917.100612] env[60788]: DEBUG oslo.service.loopingcall [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 917.100948] env[60788]: DEBUG nova.compute.manager [-] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 917.101192] env[60788]: DEBUG nova.network.neutron [-] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 917.134091] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 917.134091] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 917.134091] env[60788]: INFO nova.compute.claims [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 917.137799] env[60788]: DEBUG nova.network.neutron [-] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 917.145655] env[60788]: INFO nova.compute.manager [-] [instance: 331fc548-2076-48e2-a84b-94130a99c2ca] Took 0.04 seconds to deallocate network for instance.
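
The "Waiting for function ... _deallocate_network_with_retries to return" line at 917.100612 is oslo.service's looping-call machinery: the wrapped function is invoked repeatedly until it raises LoopingCallDone, whose retvalue becomes the overall result. Nova uses a back-off variant for this retry; the fixed-interval form of the same pattern is sketched below, with an illustrative retry count not taken from this log.

    # Fixed-interval sketch of the oslo.service looping-call pattern.
    from oslo_service import loopingcall

    state = {'calls': 0}

    def _deallocate_with_retries():
        state['calls'] += 1
        if state['calls'] >= 3:                  # pretend attempt 3 succeeds
            raise loopingcall.LoopingCallDone(True)
        # Returning normally means "not done yet"; the loop calls us again.

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    result = timer.start(interval=0.1).wait()    # True once LoopingCallDone
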
[ 917.285246] env[60788]: DEBUG oslo_concurrency.lockutils [None req-506dd8ae-d761-4d76-88c3-c8b2e86d0ffa tempest-ServersAdminNegativeTestJSON-478656401 tempest-ServersAdminNegativeTestJSON-478656401-project-member] Lock "331fc548-2076-48e2-a84b-94130a99c2ca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.237s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.590637] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f502d33e-3962-426b-8dff-786b1093d5af {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.598229] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a00e5df2-6007-43de-a442-0545d14e05b3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.628964] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e29dfe0d-756a-4b05-9e92-fbfb978dec47 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.641372] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f069a68-5b84-4ed9-a5ec-3345e8b3dd25 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.661474] env[60788]: DEBUG nova.compute.provider_tree [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 917.671032] env[60788]: DEBUG nova.scheduler.client.report [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 917.686349] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.554s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 917.686868] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Start building networks asynchronously for instance.
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 917.725706] env[60788]: DEBUG nova.compute.utils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 917.727059] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 917.727237] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 917.739160] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 917.802811] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 917.828217] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 917.828472] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 917.828632] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 917.828808] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 917.831905] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 917.831905] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 917.831905] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 917.831905] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
917.831905] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 917.832378] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 917.832378] env[60788]: DEBUG nova.virt.hardware [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 917.832378] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4031b73f-c044-4261-a4f4-29104a5bd139 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.839973] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-480e3a8b-bec5-4289-b5ea-1d83fdb6705d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 917.876767] env[60788]: DEBUG nova.policy [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2622e7e3d8424bcb8dc24406bff81ac1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e5e9ec9d68c04b37810fae19866f3a0b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 918.518088] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Successfully created port: 776da8f5-9623-4034-9a41-e8227d0de19e {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 919.840646] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Successfully updated port: 776da8f5-9623-4034-9a41-e8227d0de19e {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 919.857960] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "refresh_cache-5c7c0b6d-d4ea-4c78-8a76-934859d6571e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 919.857960] env[60788]: DEBUG 
oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired lock "refresh_cache-5c7c0b6d-d4ea-4c78-8a76-934859d6571e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 919.857960] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 919.878107] env[60788]: DEBUG nova.compute.manager [req-b5bcaa0d-e7a3-4686-8300-680f719f7820 req-45f05dfd-5700-4fa4-a9d8-d2a85e76d195 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Received event network-vif-plugged-776da8f5-9623-4034-9a41-e8227d0de19e {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 919.878328] env[60788]: DEBUG oslo_concurrency.lockutils [req-b5bcaa0d-e7a3-4686-8300-680f719f7820 req-45f05dfd-5700-4fa4-a9d8-d2a85e76d195 service nova] Acquiring lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 919.878528] env[60788]: DEBUG oslo_concurrency.lockutils [req-b5bcaa0d-e7a3-4686-8300-680f719f7820 req-45f05dfd-5700-4fa4-a9d8-d2a85e76d195 service nova] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 919.878690] env[60788]: DEBUG oslo_concurrency.lockutils [req-b5bcaa0d-e7a3-4686-8300-680f719f7820 req-45f05dfd-5700-4fa4-a9d8-d2a85e76d195 service nova] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 919.878850] env[60788]: DEBUG nova.compute.manager [req-b5bcaa0d-e7a3-4686-8300-680f719f7820 req-45f05dfd-5700-4fa4-a9d8-d2a85e76d195 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] No waiting events found dispatching network-vif-plugged-776da8f5-9623-4034-9a41-e8227d0de19e {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 919.879014] env[60788]: WARNING nova.compute.manager [req-b5bcaa0d-e7a3-4686-8300-680f719f7820 req-45f05dfd-5700-4fa4-a9d8-d2a85e76d195 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Received unexpected event network-vif-plugged-776da8f5-9623-4034-9a41-e8227d0de19e for instance with vm_state building and task_state spawning. [ 919.920061] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance cache missing network info. 
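The Acquiring/Acquired/Releasing lines around the "refresh_cache-<uuid>" lock above are oslo.concurrency's standard context-manager pattern: the lock is taken before the instance's network info cache is rebuilt and dropped when the block exits. A minimal sketch of the same usage (instance UUID taken from the log; the body is a placeholder, not Nova code):

    from oslo_concurrency import lockutils

    instance_uuid = '5c7c0b6d-d4ea-4c78-8a76-934859d6571e'
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        # rebuild the network info cache while holding the lock;
        # lockutils itself emits the acquire/release DEBUG pairs seen above
        pass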
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 920.203024] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Updating instance_info_cache with network_info: [{"id": "776da8f5-9623-4034-9a41-e8227d0de19e", "address": "fa:16:3e:2e:20:19", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap776da8f5-96", "ovs_interfaceid": "776da8f5-9623-4034-9a41-e8227d0de19e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 920.220525] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Releasing lock "refresh_cache-5c7c0b6d-d4ea-4c78-8a76-934859d6571e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 920.220843] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance network_info: |[{"id": "776da8f5-9623-4034-9a41-e8227d0de19e", "address": "fa:16:3e:2e:20:19", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap776da8f5-96", "ovs_interfaceid": "776da8f5-9623-4034-9a41-e8227d0de19e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 920.221265] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2e:20:19', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '776da8f5-9623-4034-9a41-e8227d0de19e', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 920.228888] env[60788]: DEBUG oslo.service.loopingcall [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 920.229396] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 920.229631] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1185eb17-49fa-4f0c-aa4e-edd8d984a93e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 920.251872] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 920.251872] env[60788]: value = "task-2205176" [ 920.251872] env[60788]: _type = "Task" [ 920.251872] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 920.262334] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205176, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 920.511956] env[60788]: DEBUG oslo_concurrency.lockutils [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 920.762864] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205176, 'name': CreateVM_Task} progress is 99%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 921.263908] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205176, 'name': CreateVM_Task} progress is 99%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 921.763875] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205176, 'name': CreateVM_Task, 'duration_secs': 1.312353} completed successfully. 
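The CreateVM_Task entries above show the oslo.vmware wait loop: the vCenter task is polled until it reports success (progress 0% -> 99% -> completed in 1.31s). A simplified version of that polling pattern; get_task_info here is a hypothetical callable standing in for the real PropertyCollector query, and the actual loop in oslo_vmware.api uses a looping call rather than a bare sleep:

    import time

    def wait_for_task(get_task_info, interval=0.5):
        # get_task_info() is assumed to return (state, progress)
        while True:
            state, progress = get_task_info()
            if state == 'success':
                return
            if state == 'error':
                raise RuntimeError('vCenter task failed')
            time.sleep(interval)  # progress values are what _poll_task logs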
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 921.764164] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 921.764829] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 921.765043] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 921.765368] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 921.765639] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-71b6fab0-7e25-4f08-bfa6-967b9272439e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 921.770081] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 921.770081] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52bed8a6-a4c7-5897-5480-6811ec5c0abc" [ 921.770081] env[60788]: _type = "Task" [ 921.770081] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 921.777587] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52bed8a6-a4c7-5897-5480-6811ec5c0abc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 922.017027] env[60788]: DEBUG nova.compute.manager [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Received event network-changed-776da8f5-9623-4034-9a41-e8227d0de19e {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 922.017145] env[60788]: DEBUG nova.compute.manager [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Refreshing instance network info cache due to event network-changed-776da8f5-9623-4034-9a41-e8227d0de19e. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 922.017424] env[60788]: DEBUG oslo_concurrency.lockutils [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] Acquiring lock "refresh_cache-5c7c0b6d-d4ea-4c78-8a76-934859d6571e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 922.017569] env[60788]: DEBUG oslo_concurrency.lockutils [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] Acquired lock "refresh_cache-5c7c0b6d-d4ea-4c78-8a76-934859d6571e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 922.017730] env[60788]: DEBUG nova.network.neutron [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Refreshing network info cache for port 776da8f5-9623-4034-9a41-e8227d0de19e {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 922.282420] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 922.282711] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 922.283190] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 922.643300] env[60788]: DEBUG nova.network.neutron [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Updated VIF entry in instance network info cache for port 776da8f5-9623-4034-9a41-e8227d0de19e. 
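The instance_info_cache payloads logged above (and again just below) are plain JSON; everything Nova needs about the VIF, the MAC, fixed IP, MTU and segmentation ID, is nested inside them. A small helper for pulling the fixed IPs out of such a blob, illustrative only:

    import json

    def fixed_ips(network_info_json):
        # network_info_json is the [{"id": ..., "network": {...}}] blob as
        # logged; returns every fixed IP across all VIFs and subnets
        vifs = json.loads(network_info_json)
        return [ip['address']
                for vif in vifs
                for subnet in vif['network']['subnets']
                for ip in subnet['ips']
                if ip['type'] == 'fixed']

    # For port 776da8f5-9623-4034-9a41-e8227d0de19e this yields
    # ['192.168.233.216'].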
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 922.643668] env[60788]: DEBUG nova.network.neutron [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Updating instance_info_cache with network_info: [{"id": "776da8f5-9623-4034-9a41-e8227d0de19e", "address": "fa:16:3e:2e:20:19", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.216", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap776da8f5-96", "ovs_interfaceid": "776da8f5-9623-4034-9a41-e8227d0de19e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 922.658891] env[60788]: DEBUG oslo_concurrency.lockutils [req-3f4b1b1f-2890-47e2-bbcb-bdf1ba0eee47 req-97d5353c-5279-4143-a175-1796cb350ea2 service nova] Releasing lock "refresh_cache-5c7c0b6d-d4ea-4c78-8a76-934859d6571e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 933.079890] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "529472d7-5e71-4997-96de-64d41b9d3515" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 933.082311] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "529472d7-5e71-4997-96de-64d41b9d3515" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 935.848076] env[60788]: DEBUG oslo_concurrency.lockutils [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.202768] env[60788]: DEBUG oslo_concurrency.lockutils [None req-756087a5-abc9-49f2-bbb7-12d03d5ae899 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "cceddbb3-f076-4b72-882f-71432f8f0a81" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.203036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-756087a5-abc9-49f2-bbb7-12d03d5ae899 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "cceddbb3-f076-4b72-882f-71432f8f0a81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 937.381522] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6caf0247-3988-4f45-a998-126613f0c5d8 tempest-ServerRescueTestJSON-23639953 tempest-ServerRescueTestJSON-23639953-project-member] Acquiring lock "2a45087b-e101-49fd-b102-abb56b8b88e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 937.381754] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6caf0247-3988-4f45-a998-126613f0c5d8 tempest-ServerRescueTestJSON-23639953 tempest-ServerRescueTestJSON-23639953-project-member] Lock "2a45087b-e101-49fd-b102-abb56b8b88e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 950.227659] env[60788]: DEBUG oslo_concurrency.lockutils [None req-008ae44a-6601-4421-84df-c8d1a9b92f59 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Acquiring lock "b9ce0d5b-0ee9-4585-9265-10a96ea62752" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 950.227978] env[60788]: DEBUG oslo_concurrency.lockutils [None req-008ae44a-6601-4421-84df-c8d1a9b92f59 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Lock "b9ce0d5b-0ee9-4585-9265-10a96ea62752" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 950.754255] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 952.749580] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 953.754032] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 953.754403] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 953.754649] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 953.754860] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 953.778334] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.778578] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.778781] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.779058] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.779304] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.779528] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.779671] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.779798] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.779919] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.780155] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 953.780341] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 954.754525] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.754857] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 954.754939] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 954.766019] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 954.766246] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 954.766410] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 954.766567] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 954.767738] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6129dc0-752b-4256-b700-28436e2ed4be {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 954.777507] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37a6500c-50ba-42f9-9fb0-b9f521b52ac0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 954.794100] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bbd8e365-b93d-4778-9df9-cdab75b86294 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 954.801316] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-370e4f72-6151-41be-8ddf-64624ace2d8c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 954.831351] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181178MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 954.831534] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 954.831705] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 954.912752] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.912930] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 80e7296f-45ed-4987-9884-05bd883f4144 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913071] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913200] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913321] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913439] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913554] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913670] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913824] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.913955] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 954.925597] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d5972768-f55f-495b-a49f-43b00c4647c2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 954.938479] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 911b94d1-8c01-49fa-ae13-4565a028676e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 954.949708] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 954.960316] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d136a94d-344a-4697-97b5-3d732a16f4a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 954.972854] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1ae1eb4b-4696-4592-a758-79b2211d35c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 954.983035] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4bfbddb3-f66c-4059-8624-654e180ab997 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 954.996021] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5ebb3604-792d-4fd7-95e6-d8a826c2d50a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.005246] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b7b0591b-123b-49ad-8ab6-6881d0c7888b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.015346] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 71ac0cb5-ebac-4f22-897c-1742b5416fca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.025065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 55ccb77a-7c54-4e4f-a665-43dc1c30e595 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.035771] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 71acb134-101c-482b-9e5f-bbc18b8e01d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.045223] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 9da7df70-e116-4bf9-83fb-626208162b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.055192] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f39ca342-04ed-45d7-8017-717d3a9ba244 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.064861] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b4004d4f-8a7f-42be-9ce4-5ab53ae62f78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.074133] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 822c4411-1759-4d9e-820a-5d617fdd2488 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.083476] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 61d9ecf8-0ed5-4451-9953-e53cabecf36b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.092607] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.103574] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.113227] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.123020] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance cceddbb3-f076-4b72-882f-71432f8f0a81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.132662] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2a45087b-e101-49fd-b102-abb56b8b88e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.145866] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b9ce0d5b-0ee9-4585-9265-10a96ea62752 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
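The "Final resource view" just below is simple arithmetic over the allocations listed above: ten actively managed m1.nano instances, each holding 1 vCPU, 128 MB of RAM and 1 GB of disk, plus the 512 MB of reserved host memory from the inventory. A worked check of those numbers:

    instances = 10                    # the ten "actively managed" instances above
    used_ram = instances * 128 + 512  # 1792 MB, matching used_ram=1792MB
    used_disk = instances * 1         # 10 GB, matching used_disk=10GB
    used_vcpus = instances * 1        # 10 of 48, matching used_vcpus=10
    print(used_ram, used_disk, used_vcpus)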
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 955.146158] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 955.146391] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 955.535347] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be7b2374-8803-4b64-9025-8ee22543930e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.543924] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69555468-a854-4ed6-8cb0-63e77ddddea4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.575998] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-216924f4-81da-4db9-90ed-26bbb7ff6931 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.585205] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5360a94-8b6b-4a3d-90fb-9de78cfce9b4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 955.599191] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 955.608500] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 955.672885] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 955.672885] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.807s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 956.639920] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.639920] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 956.753128] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 959.754137] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 959.985252] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c07ad454-ec60-4ee6-9a01-5a2004ff26b2 tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] Acquiring lock "3da33ce7-b346-4970-b4ab-36a74c67d3dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 959.985526] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c07ad454-ec60-4ee6-9a01-5a2004ff26b2 tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] Lock "3da33ce7-b346-4970-b4ab-36a74c67d3dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 961.765769] env[60788]: WARNING oslo_vmware.rw_handles [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 961.765769] env[60788]: ERROR oslo_vmware.rw_handles [ 
961.766326] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 961.768991] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 961.769475] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Copying Virtual Disk [datastore2] vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/f39b78d9-e18e-4845-887b-3b13e367fa86/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 961.769861] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9137b24b-613a-42b1-a7b3-bd20bec9f617 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 961.782057] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 961.782057] env[60788]: value = "task-2205187" [ 961.782057] env[60788]: _type = "Task" [ 961.782057] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 961.795409] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': task-2205187, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.293471] env[60788]: DEBUG oslo_vmware.exceptions [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 962.293772] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 962.294353] env[60788]: ERROR nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 962.294353] env[60788]: Faults: ['InvalidArgument'] [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Traceback (most recent call last): [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] yield resources [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self.driver.spawn(context, instance, image_meta, [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self._vmops.spawn(context, instance, image_meta, injected_files, [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self._fetch_image_if_missing(context, vi) [ 962.294353] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] image_cache(vi, tmp_image_ds_loc) [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] vm_util.copy_virtual_disk( [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] session._wait_for_task(vmdk_copy_task) [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] return self.wait_for_task(task_ref) [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] return evt.wait() [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] result = hub.switch() [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 962.294719] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] return self.greenlet.switch() [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self.f(*self.args, **self.kw) [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] raise exceptions.translate_fault(task_info.error) [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Faults: ['InvalidArgument'] [ 962.295111] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] [ 962.295111] env[60788]: INFO nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Terminating instance [ 962.296294] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 962.296517] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 962.296758] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-84a09d81-f859-4aaa-bfd0-612db4b2830e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.299481] 
env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 962.299725] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 962.300414] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2dcaf55-8122-48f0-9d39-8e1baa9f89d0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.311279] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 962.311514] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c000fae3-2f33-4d19-a739-9f55cc4cf7f3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.313909] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 962.314099] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 962.315098] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a3c7c732-e297-4c65-bf5a-5b26294d7b50 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.323525] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 962.323525] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52c3a368-654c-ab48-4152-0d5bc2795346" [ 962.323525] env[60788]: _type = "Task" [ 962.323525] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 962.332138] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52c3a368-654c-ab48-4152-0d5bc2795346, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.387708] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 962.387953] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 962.388146] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Deleting the datastore file [datastore2] ad93b5d9-8983-4aca-a5ee-3e48f1682122 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 962.388462] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cae7279a-14de-4f28-9445-0c141897f59a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.398518] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 962.398518] env[60788]: value = "task-2205189" [ 962.398518] env[60788]: _type = "Task" [ 962.398518] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 962.407223] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': task-2205189, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 962.835194] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 962.835477] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 962.835703] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-17a0745f-0304-46b9-a6ac-7799513548b7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.849192] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 962.850353] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Fetch image to [datastore2] vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 962.850353] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 962.850494] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c88b6659-c4eb-4e81-8178-6adc070c9017 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.858170] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e17f861-f62f-4309-8387-53017da6c216 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.868582] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7dff61c-e82d-4926-8fab-ecc7030dae84 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.907620] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95c7578b-0ce5-4e36-9122-d6b9b2f3ce00 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.918959] env[60788]: DEBUG oslo_vmware.api [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': task-2205189, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080916} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 962.919514] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 962.919702] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 962.919871] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 962.920060] env[60788]: INFO nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Took 0.62 seconds to destroy the instance on the hypervisor. 
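[editor's note] The records above capture oslo.vmware's task-polling pattern: wait_for_task blocks on an event while a looping call polls the vCenter task, logging progress ("progress is 0%") and either returning the completed task info (as with task-2205189, duration_secs 0.080916) or raising the translated fault (here VimFaultException with Faults: ['InvalidArgument']). Below is a minimal, hypothetical sketch of that poll-or-raise loop, not oslo.vmware's actual implementation; get_task_info is a stand-in for the PropertyCollector round-trip performed on each cycle.

# Minimal sketch of the poll-until-done pattern visible in the log above.
# `get_task_info` is a hypothetical stub standing in for the real
# PropertyCollector round-trip that oslo.vmware performs per poll cycle.
import time

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, interval=0.5):
    """Poll a vCenter-style task until it succeeds or errors.

    Mirrors the log's cycle: report progress while running,
    return the info on 'success', translate the fault on 'error'.
    """
    while True:
        info = get_task_info()
        state = info["state"]
        if state in ("queued", "running"):
            print(f"Task {info['id']} progress is {info.get('progress', 0)}%.")
            time.sleep(interval)
        elif state == "success":
            return info
        else:  # 'error'
            raise VimFaultException(info["faults"], info["message"])

# Usage: a fake task that fails the way task-2205187 does above.
_states = iter([
    {"id": "task-2205187", "state": "running", "progress": 0},
    {"id": "task-2205187", "state": "error",
     "faults": ["InvalidArgument"],
     "message": "A specified parameter was not correct: fileType"},
])
try:
    wait_for_task(lambda: next(_states), interval=0)
except VimFaultException as exc:
    print("Faults:", exc.fault_list)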
[ 962.921578] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-abea61d3-51ce-4f78-a1d2-b1a1e7a87197 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 962.923579] env[60788]: DEBUG nova.compute.claims [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 962.923755] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 962.923969] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 962.957343] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 963.017757] env[60788]: DEBUG oslo_vmware.rw_handles [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 963.089011] env[60788]: DEBUG oslo_vmware.rw_handles [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 963.089716] env[60788]: DEBUG oslo_vmware.rw_handles [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 963.430194] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01f749a-4e66-4def-b1b1-cb24e12ba593 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.438910] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de212a8-4584-46d5-a9b6-433f4cd75a92 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.469112] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b7176e0-2bbd-4366-ab75-5ae67659ad2b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.476951] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca759fc3-bb46-4a8e-be90-92b333d0e494 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 963.493159] env[60788]: DEBUG nova.compute.provider_tree [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 963.502235] env[60788]: DEBUG nova.scheduler.client.report [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 963.520159] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.596s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 963.520708] env[60788]: ERROR nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 963.520708] env[60788]: Faults: ['InvalidArgument'] [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Traceback (most recent call last): [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: 
ad93b5d9-8983-4aca-a5ee-3e48f1682122] self.driver.spawn(context, instance, image_meta, [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self._vmops.spawn(context, instance, image_meta, injected_files, [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self._fetch_image_if_missing(context, vi) [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] image_cache(vi, tmp_image_ds_loc) [ 963.520708] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] vm_util.copy_virtual_disk( [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] session._wait_for_task(vmdk_copy_task) [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] return self.wait_for_task(task_ref) [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] return evt.wait() [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] result = hub.switch() [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] return self.greenlet.switch() [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 963.521061] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] self.f(*self.args, **self.kw) [ 963.521358] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 963.521358] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] raise exceptions.translate_fault(task_info.error) [ 963.521358] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 963.521358] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Faults: ['InvalidArgument'] [ 963.521358] env[60788]: ERROR nova.compute.manager [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] [ 963.521550] env[60788]: DEBUG nova.compute.utils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 963.523037] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Build of instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 was re-scheduled: A specified parameter was not correct: fileType [ 963.523037] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 963.523442] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 963.523618] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 963.524068] env[60788]: DEBUG nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 963.524269] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 964.276180] env[60788]: DEBUG nova.network.neutron [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 964.293156] env[60788]: INFO nova.compute.manager [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Took 0.77 seconds to deallocate network for instance. [ 964.408759] env[60788]: INFO nova.scheduler.client.report [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Deleted allocations for instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 [ 964.438166] env[60788]: DEBUG oslo_concurrency.lockutils [None req-dc159e8a-4399-4ff1-a2d0-bb6f11048c88 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 429.580s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.439399] env[60788]: DEBUG oslo_concurrency.lockutils [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 28.591s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.439617] env[60788]: DEBUG oslo_concurrency.lockutils [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 964.439819] env[60788]: DEBUG oslo_concurrency.lockutils [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.439986] env[60788]: DEBUG oslo_concurrency.lockutils [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.442878] env[60788]: INFO nova.compute.manager [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Terminating instance [ 964.444943] env[60788]: DEBUG nova.compute.manager [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 964.445150] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 964.445599] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-49044f00-3ab8-4aff-8cfc-f6822856db11 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.456504] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47cfcafe-0349-4ef7-8ae5-44fb14683e60 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 964.468053] env[60788]: DEBUG nova.compute.manager [None req-8ac96037-c57f-46b4-9dea-9df48c62b07f tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] [instance: 6a101160-4c4a-42a5-9dfa-e7f41aa9788a] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 964.475391] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6465b186-24ca-4e90-b6e9-d3908cdfd1e1 tempest-ServersTestBootFromVolume-1578403687 tempest-ServersTestBootFromVolume-1578403687-project-member] Acquiring lock "854ad83b-7e4d-4f0d-b6d9-4e9492fc3461" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 964.475603] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6465b186-24ca-4e90-b6e9-d3908cdfd1e1 tempest-ServersTestBootFromVolume-1578403687 tempest-ServersTestBootFromVolume-1578403687-project-member] Lock "854ad83b-7e4d-4f0d-b6d9-4e9492fc3461" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.497509] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad93b5d9-8983-4aca-a5ee-3e48f1682122 could not be found. [ 964.497715] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 964.497893] env[60788]: INFO nova.compute.manager [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Took 0.05 seconds to destroy the instance on the hypervisor. [ 964.498155] env[60788]: DEBUG oslo.service.loopingcall [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 964.498564] env[60788]: DEBUG nova.compute.manager [None req-8ac96037-c57f-46b4-9dea-9df48c62b07f tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] [instance: 6a101160-4c4a-42a5-9dfa-e7f41aa9788a] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 964.499653] env[60788]: DEBUG nova.compute.manager [-] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 964.499756] env[60788]: DEBUG nova.network.neutron [-] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 964.520622] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8ac96037-c57f-46b4-9dea-9df48c62b07f tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Lock "6a101160-4c4a-42a5-9dfa-e7f41aa9788a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.626s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.530198] env[60788]: DEBUG nova.compute.manager [None req-4b9a2a04-2478-44aa-9c13-20e050488769 tempest-ServerRescueTestJSONUnderV235-1049840473 tempest-ServerRescueTestJSONUnderV235-1049840473-project-member] [instance: 57c974dd-093c-44c1-ab08-1659e25bb392] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 964.554962] env[60788]: DEBUG nova.compute.manager [None req-4b9a2a04-2478-44aa-9c13-20e050488769 tempest-ServerRescueTestJSONUnderV235-1049840473 tempest-ServerRescueTestJSONUnderV235-1049840473-project-member] [instance: 57c974dd-093c-44c1-ab08-1659e25bb392] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 964.585979] env[60788]: DEBUG nova.network.neutron [-] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 964.595655] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4b9a2a04-2478-44aa-9c13-20e050488769 tempest-ServerRescueTestJSONUnderV235-1049840473 tempest-ServerRescueTestJSONUnderV235-1049840473-project-member] Lock "57c974dd-093c-44c1-ab08-1659e25bb392" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.257s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.597851] env[60788]: INFO nova.compute.manager [-] [instance: ad93b5d9-8983-4aca-a5ee-3e48f1682122] Took 0.10 seconds to deallocate network for instance. [ 964.609841] env[60788]: DEBUG nova.compute.manager [None req-f8ea6717-4f26-410a-a4f3-f6020418bb0b tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] [instance: d5972768-f55f-495b-a49f-43b00c4647c2] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 964.644484] env[60788]: DEBUG nova.compute.manager [None req-f8ea6717-4f26-410a-a4f3-f6020418bb0b tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] [instance: d5972768-f55f-495b-a49f-43b00c4647c2] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 964.684552] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f8ea6717-4f26-410a-a4f3-f6020418bb0b tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] Lock "d5972768-f55f-495b-a49f-43b00c4647c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.986s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.699779] env[60788]: DEBUG nova.compute.manager [None req-f241914f-1c26-4436-8658-8f9894f2e61e tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] [instance: 911b94d1-8c01-49fa-ae13-4565a028676e] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 964.726321] env[60788]: DEBUG oslo_concurrency.lockutils [None req-271562e5-74e4-4385-84d0-5b25ca5145cf tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "ad93b5d9-8983-4aca-a5ee-3e48f1682122" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.287s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.729317] env[60788]: DEBUG nova.compute.manager [None req-f241914f-1c26-4436-8658-8f9894f2e61e tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] [instance: 911b94d1-8c01-49fa-ae13-4565a028676e] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 964.749407] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f241914f-1c26-4436-8658-8f9894f2e61e tempest-VolumesAdminNegativeTest-360217904 tempest-VolumesAdminNegativeTest-360217904-project-member] Lock "911b94d1-8c01-49fa-ae13-4565a028676e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.785s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 964.758196] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 964.818619] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 964.818713] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 964.822154] env[60788]: INFO nova.compute.claims [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 965.372385] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e5d7b7e-0df6-4c43-bcb1-dbce6522e60c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.381940] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd4dfc8a-e068-405c-bc93-527c0d07d83b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.415609] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c342b552-5539-4ff1-b2a4-677144b06b80 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.424567] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4304dd5-918d-4308-8035-53ba03cfcd12 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 965.438846] env[60788]: DEBUG nova.compute.provider_tree [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 965.448454] env[60788]: DEBUG nova.scheduler.client.report [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 965.464082] 
env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.645s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 965.464918] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 965.507901] env[60788]: DEBUG nova.compute.utils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 965.509394] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 965.510072] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 965.519926] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 965.608475] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 965.641724] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 965.641969] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 965.642194] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 965.642337] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 965.642486] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 965.642639] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 965.642853] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 965.643051] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 965.643238] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 965.643410] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 965.643602] env[60788]: DEBUG nova.virt.hardware [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 965.644771] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ae85f4-21d0-4bd7-9e06-7c19d2f02722 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 965.653645] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45027177-3b06-43b8-8416-07781cbfa9b7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 965.685549] env[60788]: DEBUG nova.policy [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3cba4db4cb848f1a18d2359d393ae50', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1b40bbdc85154d11be540d7c82b2c79d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 966.662875] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Successfully created port: 125de35e-6104-443a-a493-7ba311a404ab {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 968.408414] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Successfully updated port: 125de35e-6104-443a-a493-7ba311a404ab {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 968.423071] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "refresh_cache-c206be99-2f74-4c28-a008-e6edcccf65bf" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 968.423800] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquired lock "refresh_cache-c206be99-2f74-4c28-a008-e6edcccf65bf" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 968.423800] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 968.515165] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 968.597709] env[60788]: DEBUG nova.compute.manager [req-58b0476b-b04e-402a-aa9e-4cfef8b45c94 req-7d2d4032-6478-45b5-875d-22faa18b7726 service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Received event network-vif-plugged-125de35e-6104-443a-a493-7ba311a404ab {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 968.597924] env[60788]: DEBUG oslo_concurrency.lockutils [req-58b0476b-b04e-402a-aa9e-4cfef8b45c94 req-7d2d4032-6478-45b5-875d-22faa18b7726 service nova] Acquiring lock "c206be99-2f74-4c28-a008-e6edcccf65bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 968.599268] env[60788]: DEBUG oslo_concurrency.lockutils [req-58b0476b-b04e-402a-aa9e-4cfef8b45c94 req-7d2d4032-6478-45b5-875d-22faa18b7726 service nova] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 968.599268] env[60788]: DEBUG oslo_concurrency.lockutils [req-58b0476b-b04e-402a-aa9e-4cfef8b45c94 req-7d2d4032-6478-45b5-875d-22faa18b7726 service nova] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 968.599268] env[60788]: DEBUG nova.compute.manager [req-58b0476b-b04e-402a-aa9e-4cfef8b45c94 req-7d2d4032-6478-45b5-875d-22faa18b7726 service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] No waiting events found dispatching network-vif-plugged-125de35e-6104-443a-a493-7ba311a404ab {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 968.599268] env[60788]: WARNING nova.compute.manager [req-58b0476b-b04e-402a-aa9e-4cfef8b45c94 req-7d2d4032-6478-45b5-875d-22faa18b7726 service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Received unexpected event network-vif-plugged-125de35e-6104-443a-a493-7ba311a404ab for instance with vm_state building and task_state spawning.
[ 968.890058] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Updating instance_info_cache with network_info: [{"id": "125de35e-6104-443a-a493-7ba311a404ab", "address": "fa:16:3e:07:eb:c4", "network": {"id": "010b36ca-9470-4570-a6dd-ffad5971f26c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2055711516-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1b40bbdc85154d11be540d7c82b2c79d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d12aff80-9d1b-4a67-a470-9c0148b443e3", "external-id": "nsx-vlan-transportzone-784", "segmentation_id": 784, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap125de35e-61", "ovs_interfaceid": "125de35e-6104-443a-a493-7ba311a404ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 968.906734] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Releasing lock "refresh_cache-c206be99-2f74-4c28-a008-e6edcccf65bf" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 968.907102] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance network_info: |[{"id": "125de35e-6104-443a-a493-7ba311a404ab", "address": "fa:16:3e:07:eb:c4", "network": {"id": "010b36ca-9470-4570-a6dd-ffad5971f26c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2055711516-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1b40bbdc85154d11be540d7c82b2c79d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d12aff80-9d1b-4a67-a470-9c0148b443e3", "external-id": "nsx-vlan-transportzone-784", "segmentation_id": 784, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap125de35e-61", "ovs_interfaceid": "125de35e-6104-443a-a493-7ba311a404ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 968.907798] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:07:eb:c4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd12aff80-9d1b-4a67-a470-9c0148b443e3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '125de35e-6104-443a-a493-7ba311a404ab', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 968.919351] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Creating folder: Project (1b40bbdc85154d11be540d7c82b2c79d). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 968.921375] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-634060a1-b05a-48f0-b7ba-3a6364090c8b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 968.934904] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Created folder: Project (1b40bbdc85154d11be540d7c82b2c79d) in parent group-v449747.
[ 968.935614] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Creating folder: Instances. Parent ref: group-v449806. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 968.935614] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c2c9971f-23d7-46a9-8b33-cbe972d8096d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 968.951370] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Created folder: Instances in parent group-v449806.
[ 968.951596] env[60788]: DEBUG oslo.service.loopingcall [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 968.951788] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 968.951994] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2a81c102-39ae-4f1e-9330-e8367b96c479 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 968.977779] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 968.977779] env[60788]: value = "task-2205192"
[ 968.977779] env[60788]: _type = "Task"
[ 968.977779] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 968.987199] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205192, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 969.488268] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205192, 'name': CreateVM_Task, 'duration_secs': 0.31847} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 969.488617] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 969.489120] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 969.489294] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 969.489624] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 969.489878] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd49d3c1-7ad3-43a3-8e72-41292a78bbae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 969.494674] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Waiting for the task: (returnval){
[ 969.494674] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52d3a813-482e-e35f-5942-146c5dd6eddb"
[ 969.494674] env[60788]: _type = "Task"
[ 969.494674] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 969.502588] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52d3a813-482e-e35f-5942-146c5dd6eddb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 970.012336] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 970.012617] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 970.012837] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 970.640065] env[60788]: DEBUG nova.compute.manager [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Received event network-changed-125de35e-6104-443a-a493-7ba311a404ab {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 970.640318] env[60788]: DEBUG nova.compute.manager [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Refreshing instance network info cache due to event network-changed-125de35e-6104-443a-a493-7ba311a404ab. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 970.640481] env[60788]: DEBUG oslo_concurrency.lockutils [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] Acquiring lock "refresh_cache-c206be99-2f74-4c28-a008-e6edcccf65bf" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 970.640620] env[60788]: DEBUG oslo_concurrency.lockutils [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] Acquired lock "refresh_cache-c206be99-2f74-4c28-a008-e6edcccf65bf" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 970.640772] env[60788]: DEBUG nova.network.neutron [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Refreshing network info cache for port 125de35e-6104-443a-a493-7ba311a404ab {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 971.021570] env[60788]: DEBUG nova.network.neutron [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Updated VIF entry in instance network info cache for port 125de35e-6104-443a-a493-7ba311a404ab. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 971.021937] env[60788]: DEBUG nova.network.neutron [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Updating instance_info_cache with network_info: [{"id": "125de35e-6104-443a-a493-7ba311a404ab", "address": "fa:16:3e:07:eb:c4", "network": {"id": "010b36ca-9470-4570-a6dd-ffad5971f26c", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2055711516-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1b40bbdc85154d11be540d7c82b2c79d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d12aff80-9d1b-4a67-a470-9c0148b443e3", "external-id": "nsx-vlan-transportzone-784", "segmentation_id": 784, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap125de35e-61", "ovs_interfaceid": "125de35e-6104-443a-a493-7ba311a404ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 971.034294] env[60788]: DEBUG oslo_concurrency.lockutils [req-843a83fa-acf0-46c4-ae79-956c82f495ce req-3ad38c92-e440-40eb-9b5c-d904eb1202cb service nova] Releasing lock "refresh_cache-c206be99-2f74-4c28-a008-e6edcccf65bf" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 972.576380] env[60788]: DEBUG oslo_concurrency.lockutils [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "c206be99-2f74-4c28-a008-e6edcccf65bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 977.813071] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "28605b2e-9795-47a0-821c-5cf8da077d37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 977.813351] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "28605b2e-9795-47a0-821c-5cf8da077d37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 977.849082] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "77e5ef91-47b8-4d27-a899-8f4a910851b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 977.851081] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "77e5ef91-47b8-4d27-a899-8f4a910851b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 982.609970] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f03e83e7-6092-4deb-b19c-e3b730ef6142 tempest-InstanceActionsNegativeTestJSON-758130375 tempest-InstanceActionsNegativeTestJSON-758130375-project-member] Acquiring lock "f4a6ac93-39eb-4a36-93d3-b01150092707" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 982.610350] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f03e83e7-6092-4deb-b19c-e3b730ef6142 tempest-InstanceActionsNegativeTestJSON-758130375 tempest-InstanceActionsNegativeTestJSON-758130375-project-member] Lock "f4a6ac93-39eb-4a36-93d3-b01150092707" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 985.109372] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8e7c8fdf-841f-4b79-8248-a1f51f09a489 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "027da562-4bf7-436d-bd68-af586797587a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 985.109693] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8e7c8fdf-841f-4b79-8248-a1f51f09a489 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "027da562-4bf7-436d-bd68-af586797587a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 990.236128] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e2110a6-fae3-4ad5-966f-e06fad47fd2a tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] Acquiring lock "6489648d-415a-4625-9ac6-7ee30622c8bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 990.236533] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e2110a6-fae3-4ad5-966f-e06fad47fd2a tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] Lock "6489648d-415a-4625-9ac6-7ee30622c8bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 991.268778] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3b9638fa-c424-42c2-b9c7-09452bbc1f0e tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] Acquiring lock "49f64e1c-063b-4483-bd68-7423b72ea4a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 991.269054] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3b9638fa-c424-42c2-b9c7-09452bbc1f0e tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] Lock "49f64e1c-063b-4483-bd68-7423b72ea4a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 998.982134] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edbb33ae-4359-4d10-8cb7-f86cef7c0a78 tempest-ServersNegativeTestMultiTenantJSON-929736961 tempest-ServersNegativeTestMultiTenantJSON-929736961-project-member] Acquiring lock "0ea664eb-1978-4725-b8a5-75ce53f0d165" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 998.982461] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edbb33ae-4359-4d10-8cb7-f86cef7c0a78 tempest-ServersNegativeTestMultiTenantJSON-929736961 tempest-ServersNegativeTestMultiTenantJSON-929736961-project-member] Lock "0ea664eb-1978-4725-b8a5-75ce53f0d165" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1002.185276] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfda5757-27b3-4205-8301-d99854360996 tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] Acquiring lock "2da6479b-4b3a-4d7d-91cb-81b563b11732" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1002.185540] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfda5757-27b3-4205-8301-d99854360996 tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] Lock "2da6479b-4b3a-4d7d-91cb-81b563b11732" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1003.335179] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8de8c1bc-e269-4146-a502-f028313daa01 tempest-ServerActionsTestJSON-2104909477 tempest-ServerActionsTestJSON-2104909477-project-member] Acquiring lock "188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1003.335589] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8de8c1bc-e269-4146-a502-f028313daa01 tempest-ServerActionsTestJSON-2104909477 tempest-ServerActionsTestJSON-2104909477-project-member] Lock "188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1005.810629] env[60788]: DEBUG oslo_concurrency.lockutils [None req-42a0f89b-848a-4c32-9cfe-ae958be870db tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1005.810629] env[60788]: DEBUG oslo_concurrency.lockutils [None req-42a0f89b-848a-4c32-9cfe-ae958be870db tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1009.174437] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d9155b2b-84ad-4132-9a59-af367d8be14a tempest-ServerMetadataNegativeTestJSON-1034376198 tempest-ServerMetadataNegativeTestJSON-1034376198-project-member] Acquiring lock "3265e0a4-28f2-4484-a164-4dc5af01d6ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1009.174745] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d9155b2b-84ad-4132-9a59-af367d8be14a tempest-ServerMetadataNegativeTestJSON-1034376198 tempest-ServerMetadataNegativeTestJSON-1034376198-project-member] Lock "3265e0a4-28f2-4484-a164-4dc5af01d6ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1010.753588] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1012.882456] env[60788]: WARNING oslo_vmware.rw_handles [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1012.882456] env[60788]: ERROR oslo_vmware.rw_handles
[ 1012.883125] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1012.884930] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1012.885200] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Copying Virtual Disk [datastore2] vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/0bfdad4c-04ba-452a-a3d8-c726c9cf2746/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1012.885512] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3c62d414-4966-4d2c-b164-1a55261224f3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1012.894715] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){
[ 1012.894715] env[60788]: value = "task-2205193"
[ 1012.894715] env[60788]: _type = "Task"
[ 1012.894715] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1012.903329] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205193, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1013.404929] env[60788]: DEBUG oslo_vmware.exceptions [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1013.405268] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1013.405980] env[60788]: ERROR nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1013.405980] env[60788]: Faults: ['InvalidArgument']
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Traceback (most recent call last):
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     yield resources
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     self.driver.spawn(context, instance, image_meta,
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     self._fetch_image_if_missing(context, vi)
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1013.405980] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     image_cache(vi, tmp_image_ds_loc)
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     vm_util.copy_virtual_disk(
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     session._wait_for_task(vmdk_copy_task)
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     return self.wait_for_task(task_ref)
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     return evt.wait()
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     result = hub.switch()
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     return self.greenlet.switch()
[ 1013.406530] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1013.407201] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     self.f(*self.args, **self.kw)
[ 1013.407201] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1013.407201] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]     raise exceptions.translate_fault(task_info.error)
[ 1013.407201] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1013.407201] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Faults: ['InvalidArgument']
[ 1013.407201] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]
[ 1013.407201] env[60788]: INFO nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Terminating instance
[ 1013.408012] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1013.408236] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1013.408861] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1013.411062] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1013.411062] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3fb5bd5-75f7-4742-a582-70b118a60761 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.411605] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a654df69-5b51-4b53-a792-9ad406e97876 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.418671] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1013.418894] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2f56e212-0c23-4009-b3d9-6d7359b6c9a7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.421175] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1013.421346] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1013.422402] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c54313b-a635-4790-a357-07433c05f258 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.426964] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for the task: (returnval){
[ 1013.426964] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]524d7884-f181-6c80-de90-1eb938707997"
[ 1013.426964] env[60788]: _type = "Task"
[ 1013.426964] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1013.435035] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]524d7884-f181-6c80-de90-1eb938707997, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1013.487326] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1013.487609] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1013.487800] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleting the datastore file [datastore2] 80e7296f-45ed-4987-9884-05bd883f4144 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1013.488078] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2f03a640-c79b-4d6a-a087-b05289cff7e0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.494196] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){
[ 1013.494196] env[60788]: value = "task-2205195"
[ 1013.494196] env[60788]: _type = "Task"
[ 1013.494196] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1013.501677] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205195, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1013.937706] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1013.938019] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Creating directory with path [datastore2] vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1013.938236] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-762307c3-10e1-49d2-b860-6c11a53854ab {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.950427] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Created directory with path [datastore2] vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1013.950624] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Fetch image to [datastore2] vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1013.950794] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1013.951559] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e900fbb-4df4-4b42-8070-61f8623a7175 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.958561] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-018f8d4d-aebc-4eef-8410-f0e285007e43 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1013.967523] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0046155b-3dc4-4c54-9155-9fae0a919fc1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.001636] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d6b3425-15c8-4ce2-868d-16daadb5df78 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.008495] env[60788]: DEBUG oslo_vmware.api [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205195, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080333} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1014.009944] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1014.010150] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1014.010323] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1014.010495] env[60788]: INFO nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1014.012503] env[60788]: DEBUG nova.compute.claims [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1014.012667] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1014.012877] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1014.015268] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e5ef9e7c-2ff5-425f-b508-46143910ce5c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.037554] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1014.094942] env[60788]: DEBUG oslo_vmware.rw_handles [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1014.155498] env[60788]: DEBUG oslo_vmware.rw_handles [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1014.155658] env[60788]: DEBUG oslo_vmware.rw_handles [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1014.434397] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-095f8ed3-06bb-4090-b79a-7c0ce8e0d054 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.441919] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f24b664-35cc-4c4e-9fd4-252a59efeee8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.472276] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-086f8bd1-52ba-45da-8a8b-d2c4bc5091ec {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.479441] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb3a668-2c86-4ce8-868a-72e062676eff {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1014.492361] env[60788]: DEBUG nova.compute.provider_tree [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1014.500850] env[60788]: DEBUG nova.scheduler.client.report [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1014.515532] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.503s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1014.516085] env[60788]: ERROR nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1014.516085] env[60788]: Faults: ['InvalidArgument']
[ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Traceback (most recent call last):
[ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144]
self.driver.spawn(context, instance, image_meta, [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] self._fetch_image_if_missing(context, vi) [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] image_cache(vi, tmp_image_ds_loc) [ 1014.516085] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] vm_util.copy_virtual_disk( [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] session._wait_for_task(vmdk_copy_task) [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] return self.wait_for_task(task_ref) [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] return evt.wait() [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] result = hub.switch() [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] return self.greenlet.switch() [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1014.516418] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] self.f(*self.args, **self.kw) [ 1014.516774] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1014.516774] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] raise exceptions.translate_fault(task_info.error) [ 1014.516774] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1014.516774] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Faults: ['InvalidArgument'] [ 1014.516774] env[60788]: ERROR nova.compute.manager [instance: 80e7296f-45ed-4987-9884-05bd883f4144] [ 1014.516774] env[60788]: DEBUG nova.compute.utils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1014.518204] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Build of instance 80e7296f-45ed-4987-9884-05bd883f4144 was re-scheduled: A specified parameter was not correct: fileType [ 1014.518204] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1014.518592] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1014.518777] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1014.518952] env[60788]: DEBUG nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1014.519203] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1014.753595] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1014.753775] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1014.954212] env[60788]: DEBUG nova.network.neutron [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1014.964702] env[60788]: INFO nova.compute.manager [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Took 0.45 seconds to deallocate network for instance. [ 1015.053932] env[60788]: INFO nova.scheduler.client.report [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted allocations for instance 80e7296f-45ed-4987-9884-05bd883f4144 [ 1015.078605] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53492b76-3bc3-48ca-9040-45afd8a88683 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "80e7296f-45ed-4987-9884-05bd883f4144" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 479.605s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.079908] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "80e7296f-45ed-4987-9884-05bd883f4144" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 279.434s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.079908] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "80e7296f-45ed-4987-9884-05bd883f4144-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.080979] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "80e7296f-45ed-4987-9884-05bd883f4144-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.080979] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "80e7296f-45ed-4987-9884-05bd883f4144-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.082139] env[60788]: INFO nova.compute.manager [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Terminating instance [ 1015.083806] env[60788]: 
DEBUG nova.compute.manager [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1015.083991] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1015.084464] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7e327d79-4022-404b-ae5d-218517a07dc3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.093556] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0effb33a-f5f3-474b-9e85-99191757397d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.104632] env[60788]: DEBUG nova.compute.manager [None req-fc197db7-f494-4fbe-8055-4da203d1be00 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] [instance: d136a94d-344a-4697-97b5-3d732a16f4a0] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.126887] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 80e7296f-45ed-4987-9884-05bd883f4144 could not be found. [ 1015.127112] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1015.127294] env[60788]: INFO nova.compute.manager [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1015.127567] env[60788]: DEBUG oslo.service.loopingcall [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1015.127791] env[60788]: DEBUG nova.compute.manager [-] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1015.127891] env[60788]: DEBUG nova.network.neutron [-] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1015.130232] env[60788]: DEBUG nova.compute.manager [None req-fc197db7-f494-4fbe-8055-4da203d1be00 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] [instance: d136a94d-344a-4697-97b5-3d732a16f4a0] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.151278] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fc197db7-f494-4fbe-8055-4da203d1be00 tempest-MigrationsAdminTest-638304573 tempest-MigrationsAdminTest-638304573-project-member] Lock "d136a94d-344a-4697-97b5-3d732a16f4a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.304s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.161132] env[60788]: DEBUG nova.compute.manager [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] [instance: 1ae1eb4b-4696-4592-a758-79b2211d35c0] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.175057] env[60788]: DEBUG nova.network.neutron [-] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1015.205620] env[60788]: INFO nova.compute.manager [-] [instance: 80e7296f-45ed-4987-9884-05bd883f4144] Took 0.08 seconds to deallocate network for instance. [ 1015.206056] env[60788]: DEBUG nova.compute.manager [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] [instance: 1ae1eb4b-4696-4592-a758-79b2211d35c0] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.227713] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Lock "1ae1eb4b-4696-4592-a758-79b2211d35c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.005s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.239841] env[60788]: DEBUG nova.compute.manager [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] [instance: 4bfbddb3-f66c-4059-8624-654e180ab997] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.261768] env[60788]: DEBUG nova.compute.manager [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] [instance: 4bfbddb3-f66c-4059-8624-654e180ab997] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.286215] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Lock "4bfbddb3-f66c-4059-8624-654e180ab997" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.031s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.301395] env[60788]: DEBUG nova.compute.manager [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] [instance: 5ebb3604-792d-4fd7-95e6-d8a826c2d50a] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.312327] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f908f480-2304-413b-b8df-ec6683b922d0 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "80e7296f-45ed-4987-9884-05bd883f4144" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.233s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.327390] env[60788]: DEBUG nova.compute.manager [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] [instance: 5ebb3604-792d-4fd7-95e6-d8a826c2d50a] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.346857] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e80595be-ade1-468a-9204-ab91f73c70c7 tempest-ListServersNegativeTestJSON-1533122114 tempest-ListServersNegativeTestJSON-1533122114-project-member] Lock "5ebb3604-792d-4fd7-95e6-d8a826c2d50a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.055s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.356831] env[60788]: DEBUG nova.compute.manager [None req-5e854664-c4b9-4f68-b7b1-cb373930a57f tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: b7b0591b-123b-49ad-8ab6-6881d0c7888b] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.380214] env[60788]: DEBUG nova.compute.manager [None req-5e854664-c4b9-4f68-b7b1-cb373930a57f tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: b7b0591b-123b-49ad-8ab6-6881d0c7888b] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.400476] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5e854664-c4b9-4f68-b7b1-cb373930a57f tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "b7b0591b-123b-49ad-8ab6-6881d0c7888b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.655s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.410929] env[60788]: DEBUG nova.compute.manager [None req-e9757ac4-1971-433c-94b6-35ae5be9a558 tempest-ServerTagsTestJSON-1708565971 tempest-ServerTagsTestJSON-1708565971-project-member] [instance: 71ac0cb5-ebac-4f22-897c-1742b5416fca] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.434354] env[60788]: DEBUG nova.compute.manager [None req-e9757ac4-1971-433c-94b6-35ae5be9a558 tempest-ServerTagsTestJSON-1708565971 tempest-ServerTagsTestJSON-1708565971-project-member] [instance: 71ac0cb5-ebac-4f22-897c-1742b5416fca] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.456312] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e9757ac4-1971-433c-94b6-35ae5be9a558 tempest-ServerTagsTestJSON-1708565971 tempest-ServerTagsTestJSON-1708565971-project-member] Lock "71ac0cb5-ebac-4f22-897c-1742b5416fca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.905s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.465377] env[60788]: DEBUG nova.compute.manager [None req-463b86de-7bbc-4955-a233-560623237ed5 tempest-ImagesOneServerNegativeTestJSON-1867104132 tempest-ImagesOneServerNegativeTestJSON-1867104132-project-member] [instance: 55ccb77a-7c54-4e4f-a665-43dc1c30e595] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.488223] env[60788]: DEBUG nova.compute.manager [None req-463b86de-7bbc-4955-a233-560623237ed5 tempest-ImagesOneServerNegativeTestJSON-1867104132 tempest-ImagesOneServerNegativeTestJSON-1867104132-project-member] [instance: 55ccb77a-7c54-4e4f-a665-43dc1c30e595] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.508560] env[60788]: DEBUG oslo_concurrency.lockutils [None req-463b86de-7bbc-4955-a233-560623237ed5 tempest-ImagesOneServerNegativeTestJSON-1867104132 tempest-ImagesOneServerNegativeTestJSON-1867104132-project-member] Lock "55ccb77a-7c54-4e4f-a665-43dc1c30e595" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.928s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.518518] env[60788]: DEBUG nova.compute.manager [None req-91c99aef-65a8-4b70-b938-eeab8a6850ca tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] [instance: 71acb134-101c-482b-9e5f-bbc18b8e01d7] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.543390] env[60788]: DEBUG nova.compute.manager [None req-91c99aef-65a8-4b70-b938-eeab8a6850ca tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] [instance: 71acb134-101c-482b-9e5f-bbc18b8e01d7] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.564224] env[60788]: DEBUG oslo_concurrency.lockutils [None req-91c99aef-65a8-4b70-b938-eeab8a6850ca tempest-AttachVolumeShelveTestJSON-1571779989 tempest-AttachVolumeShelveTestJSON-1571779989-project-member] Lock "71acb134-101c-482b-9e5f-bbc18b8e01d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.455s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.575870] env[60788]: DEBUG nova.compute.manager [None req-3f263ee7-2870-4d1f-8d87-c94368b52fe5 tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] [instance: 9da7df70-e116-4bf9-83fb-626208162b27] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.600459] env[60788]: DEBUG nova.compute.manager [None req-3f263ee7-2870-4d1f-8d87-c94368b52fe5 tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] [instance: 9da7df70-e116-4bf9-83fb-626208162b27] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.621564] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3f263ee7-2870-4d1f-8d87-c94368b52fe5 tempest-SecurityGroupsTestJSON-1376363831 tempest-SecurityGroupsTestJSON-1376363831-project-member] Lock "9da7df70-e116-4bf9-83fb-626208162b27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.813s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.630900] env[60788]: DEBUG nova.compute.manager [None req-8208d9af-1af9-4e0a-ac46-6b181f238d79 tempest-ImagesNegativeTestJSON-1147299362 tempest-ImagesNegativeTestJSON-1147299362-project-member] [instance: f39ca342-04ed-45d7-8017-717d3a9ba244] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.684324] env[60788]: DEBUG nova.compute.manager [None req-8208d9af-1af9-4e0a-ac46-6b181f238d79 tempest-ImagesNegativeTestJSON-1147299362 tempest-ImagesNegativeTestJSON-1147299362-project-member] [instance: f39ca342-04ed-45d7-8017-717d3a9ba244] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.706270] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8208d9af-1af9-4e0a-ac46-6b181f238d79 tempest-ImagesNegativeTestJSON-1147299362 tempest-ImagesNegativeTestJSON-1147299362-project-member] Lock "f39ca342-04ed-45d7-8017-717d3a9ba244" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.506s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.715601] env[60788]: DEBUG nova.compute.manager [None req-0c58645f-7047-4da1-a24f-db229131d350 tempest-ServerShowV254Test-617035683 tempest-ServerShowV254Test-617035683-project-member] [instance: b4004d4f-8a7f-42be-9ce4-5ab53ae62f78] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.740067] env[60788]: DEBUG nova.compute.manager [None req-0c58645f-7047-4da1-a24f-db229131d350 tempest-ServerShowV254Test-617035683 tempest-ServerShowV254Test-617035683-project-member] [instance: b4004d4f-8a7f-42be-9ce4-5ab53ae62f78] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.748738] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1015.753605] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1015.754655] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1015.754655] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1015.775500] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.775718] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.775995] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.776230] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.776366] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.776528] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.776693] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.776851] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.776999] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1015.777189] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1015.777964] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1015.780243] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1015.781187] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1015.788819] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c58645f-7047-4da1-a24f-db229131d350 tempest-ServerShowV254Test-617035683 tempest-ServerShowV254Test-617035683-project-member] Lock "b4004d4f-8a7f-42be-9ce4-5ab53ae62f78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.582s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.794367] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.794692] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.794972] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.795238] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1015.797042] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d92304-360b-4324-b505-d63c4e358dc6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.801097] env[60788]: DEBUG nova.compute.manager [None req-3028fffc-75c6-4bda-a932-62061e55af94 tempest-ServerActionsV293TestJSON-413592385 tempest-ServerActionsV293TestJSON-413592385-project-member] [instance: 822c4411-1759-4d9e-820a-5d617fdd2488] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.812838] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b492ba08-7256-46b7-ae39-552ed0c20247 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.830990] env[60788]: DEBUG nova.compute.manager [None req-3028fffc-75c6-4bda-a932-62061e55af94 tempest-ServerActionsV293TestJSON-413592385 tempest-ServerActionsV293TestJSON-413592385-project-member] [instance: 822c4411-1759-4d9e-820a-5d617fdd2488] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.832902] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e95747eb-a27e-481d-9c38-23883d89c0b5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.841158] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2f8c6a8-f1de-4e7a-920d-cef2443c39c8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1015.873659] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181255MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1015.873823] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.874075] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.876533] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3028fffc-75c6-4bda-a932-62061e55af94 tempest-ServerActionsV293TestJSON-413592385 tempest-ServerActionsV293TestJSON-413592385-project-member] Lock "822c4411-1759-4d9e-820a-5d617fdd2488" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.290s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.891144] env[60788]: DEBUG nova.compute.manager [None req-a6cbf59b-5cee-4a47-882a-f105b92835ec tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 61d9ecf8-0ed5-4451-9953-e53cabecf36b] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.925888] env[60788]: DEBUG nova.compute.manager [None req-a6cbf59b-5cee-4a47-882a-f105b92835ec tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 61d9ecf8-0ed5-4451-9953-e53cabecf36b] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1015.945811] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6cbf59b-5cee-4a47-882a-f105b92835ec tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "61d9ecf8-0ed5-4451-9953-e53cabecf36b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.076s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1015.947774] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.947916] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948062] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948192] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948315] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948433] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948549] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948663] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.948775] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1015.954777] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1015.959114] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1015.968365] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1015.978839] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1015.988044] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance cceddbb3-f076-4b72-882f-71432f8f0a81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1015.997225] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2a45087b-e101-49fd-b102-abb56b8b88e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.005941] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b9ce0d5b-0ee9-4585-9265-10a96ea62752 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.015023] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1016.015666] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3da33ce7-b346-4970-b4ab-36a74c67d3dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.025983] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 854ad83b-7e4d-4f0d-b6d9-4e9492fc3461 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.035540] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.046341] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 77e5ef91-47b8-4d27-a899-8f4a910851b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.056609] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f4a6ac93-39eb-4a36-93d3-b01150092707 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.065307] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 027da562-4bf7-436d-bd68-af586797587a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.074096] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6489648d-415a-4625-9ac6-7ee30622c8bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.082883] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 49f64e1c-063b-4483-bd68-7423b72ea4a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.091906] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0ea664eb-1978-4725-b8a5-75ce53f0d165 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.100923] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2da6479b-4b3a-4d7d-91cb-81b563b11732 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.109742] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.118922] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.128366] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3265e0a4-28f2-4484-a164-4dc5af01d6ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1016.128617] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1016.128775] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1016.500590] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67d031ea-f7d4-4918-98ca-a80cb59895cf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.508255] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe43333c-c2a1-4416-9ea0-0f6e2d4b6edd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.539378] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8efaa85e-30e8-4dd9-8a0b-b53d93bac9cb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.547011] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35628905-4b07-4d67-8224-8b17bbd3c10a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.560423] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1016.568660] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1016.581580] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1016.581763] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1016.582045] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.567s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1016.583481] env[60788]: INFO nova.compute.claims [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1016.951155] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a53716d-6bbe-44fb-9c0d-e21e80cf742f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.959009] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12280b0f-a7db-474a-86b7-6478f8902384 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.989867] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18aeb279-73b9-4fd6-8e46-4347d67767e2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1016.996946] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e029ae6-9bf4-4e1d-b362-6dcfaab5703e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.009945] env[60788]: DEBUG nova.compute.provider_tree [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1017.017852] env[60788]: DEBUG nova.scheduler.client.report [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1017.030376] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1017.031045] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1017.061803] env[60788]: DEBUG nova.compute.utils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1017.063863] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Not allocating networking since 'none' was specified. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 1017.071550] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1017.133621] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Start spawning the instance on the hypervisor. 
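
The inventory reported in the surrounding entries feeds the standard Placement capacity rule: usable capacity per resource class is (total - reserved) * allocation_ratio. Below is a minimal Python sketch with the figures copied from the inventory logged above; the dict layout and helper function are illustrative, not nova's own code.

# Minimal sketch: Placement-style capacity from the inventory logged above.
# capacity = (total - reserved) * allocation_ratio; numbers copied from the log.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

def capacity(inv: dict) -> dict:
    """Usable capacity per resource class: (total - reserved) * ratio."""
    return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inv.items()}

print(capacity(inventory))
# {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}

This is consistent with the "Final resource view" entry above: 48 physical vCPUs stretch to 192 schedulable ones at the 4.0 allocation ratio, of which 9 are currently allocated.
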
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1017.159146] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1017.159481] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1017.159694] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1017.160554] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1017.160554] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1017.160554] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1017.160554] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1017.160554] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1017.160800] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 
tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1017.161074] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1017.161292] env[60788]: DEBUG nova.virt.hardware [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1017.162207] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-116b67c3-20ec-48df-9a8b-a35b166169c4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.170366] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03fabb8b-bc42-4945-a4ef-5b301c891da3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.183782] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance VIF info [] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1017.189444] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Creating folder: Project (d80e223e192c4f218fe8d31f49170823). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1017.189699] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f8060337-ad7d-4da8-aed6-c2d24a8f2fa1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.199460] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Created folder: Project (d80e223e192c4f218fe8d31f49170823) in parent group-v449747. [ 1017.199602] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Creating folder: Instances. Parent ref: group-v449809. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1017.199821] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0636b53a-4990-4d04-91d1-28b950a3a9bc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.208773] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Created folder: Instances in parent group-v449809. 
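
The topology negotiation logged just above enumerates every (sockets, cores, threads) split whose product equals the flavor's vCPU count, within the 65536 per-dimension limits. A rough, illustrative sketch of that search follows; it mirrors the logged outcome but is not nova.virt.hardware's actual code.

# Rough sketch of the topology search logged above: enumerate
# (sockets, cores, threads) triples whose product equals the vCPU count,
# subject to per-dimension maximums.
from itertools import product

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    found = []
    # Each dimension can be at most vcpus when the product must equal vcpus.
    for s, c, t in product(range(1, min(vcpus, max_sockets) + 1),
                           range(1, min(vcpus, max_cores) + 1),
                           range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            found.append((s, c, t))
    return found

print(possible_topologies(1))  # [(1, 1, 1)]

For the 1-vCPU m1.nano flavor only 1:1:1 survives, matching "Got 1 possible topologies" and the sorted desired topology in the entries above.
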
[ 1017.209641] env[60788]: DEBUG oslo.service.loopingcall [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1017.209641] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1017.209641] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9e11b50b-913b-467c-934b-5663451d7e15 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.225791] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1017.225791] env[60788]: value = "task-2205198" [ 1017.225791] env[60788]: _type = "Task" [ 1017.225791] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1017.232965] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205198, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1017.736318] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205198, 'name': CreateVM_Task, 'duration_secs': 0.232276} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1017.736500] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1017.736920] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1017.737095] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1017.737424] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1017.737729] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd298bc8-6bce-4952-9ab0-a46297e67826 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1017.742316] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for the task: (returnval){ 
[ 1017.742316] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]528a8ce2-441e-7ae5-ed4f-b1c2c3c9dce2" [ 1017.742316] env[60788]: _type = "Task" [ 1017.742316] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1017.749804] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]528a8ce2-441e-7ae5-ed4f-b1c2c3c9dce2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1018.252355] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1018.252644] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1018.252813] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1018.558680] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1019.754481] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1020.096347] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1041.294840] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "f7fa5c24-7ff5-4656-897f-b0164c989207" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1041.294840] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.805425] env[60788]: WARNING oslo_vmware.rw_handles [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1061.805425] env[60788]: ERROR oslo_vmware.rw_handles [ 1061.806090] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1061.808350] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1061.808699] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Copying Virtual Disk [datastore2] vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/b37ede69-f57d-4a4b-bd45-3666570d351d/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1061.809019] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-5310bbf5-7bee-41c3-9aa8-f6c211ff6a1d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1061.817703] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for the task: (returnval){ [ 1061.817703] env[60788]: value = "task-2205199" [ 1061.817703] env[60788]: _type = "Task" [ 1061.817703] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1061.826615] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Task: {'id': task-2205199, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.328770] env[60788]: DEBUG oslo_vmware.exceptions [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1062.330054] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1062.330054] env[60788]: ERROR nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1062.330054] env[60788]: Faults: ['InvalidArgument'] [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Traceback (most recent call last): [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] yield resources [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self.driver.spawn(context, instance, image_meta, [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1062.330054] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1062.330655] 
env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self._fetch_image_if_missing(context, vi) [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] image_cache(vi, tmp_image_ds_loc) [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] vm_util.copy_virtual_disk( [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] session._wait_for_task(vmdk_copy_task) [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] return self.wait_for_task(task_ref) [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] return evt.wait() [ 1062.330655] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] result = hub.switch() [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] return self.greenlet.switch() [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self.f(*self.args, **self.kw) [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] raise exceptions.translate_fault(task_info.error) [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Faults: ['InvalidArgument'] [ 1062.331380] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] [ 
1062.331380] env[60788]: INFO nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Terminating instance [ 1062.332419] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.332688] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1062.332973] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52ae7e02-82b5-4e86-9e2c-d118cd33d1e4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.335256] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1062.335498] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquired lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.335663] env[60788]: DEBUG nova.network.neutron [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1062.343064] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1062.343714] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1062.343994] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bc040411-6c27-4c92-a017-7db6773c33a1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.352053] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 1062.352053] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52f8cd6f-0d4a-febb-1d33-4442cdc2d901" [ 1062.352053] env[60788]: _type = "Task" [ 1062.352053] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.359985] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52f8cd6f-0d4a-febb-1d33-4442cdc2d901, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.367234] env[60788]: DEBUG nova.network.neutron [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1062.469873] env[60788]: DEBUG nova.network.neutron [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1062.479815] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Releasing lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1062.480246] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1062.480437] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1062.481580] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b363579b-c04d-4958-aa8e-a88f6425f49e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.489859] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1062.490121] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d3f629b7-2502-460f-8f55-5fdce4ac4c65 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.521585] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1062.521865] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1062.521987] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Deleting the datastore file [datastore2] 16c2dc56-0095-437a-942f-fcfd49c3e8f3 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1062.522334] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1f732605-5889-4029-a480-85a6e399583c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.528442] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for the task: (returnval){ [ 1062.528442] env[60788]: value = "task-2205201" [ 1062.528442] env[60788]: _type = "Task" [ 1062.528442] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.536638] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Task: {'id': task-2205201, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.863072] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1062.863377] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating directory with path [datastore2] vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1062.863602] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6806c8b1-a400-424d-b2c6-ed01dad23cee {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.875394] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Created directory with path [datastore2] vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1062.875645] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Fetch image to [datastore2] vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1062.875781] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1062.876506] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d872117-be3b-42dc-acf2-ccb19c05de1a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.884645] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed2b42bf-05ef-4a0a-9d70-94810b73f561 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.893431] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60a73de4-1ae3-40fa-831c-b467acf763da {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.925181] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c359ce9-c8d6-45a0-9985-07a8964cd6b7 {{(pid=60788) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.931060] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-62b7fc57-c9e1-4c7a-a3f7-c3fe4bf901c1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.955199] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1063.007828] env[60788]: DEBUG oslo_vmware.rw_handles [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1063.072924] env[60788]: DEBUG oslo_vmware.api [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Task: {'id': task-2205201, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042568} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1063.073445] env[60788]: DEBUG oslo_vmware.rw_handles [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1063.073609] env[60788]: DEBUG oslo_vmware.rw_handles [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
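
The download above streams the image through an HTTP write handle and only learns the server's verdict when the handle is closed. A hypothetical stand-in for that flow is sketched below; the real logic lives in oslo_vmware.rw_handles, and all names here are illustrative.

# Hypothetical stand-in for the write-handle flow in the log: stream an
# image iterator into an HTTP connection, then read the response on close.
import http.client

def upload(host, path, data_iter, size):
    conn = http.client.HTTPSConnection(host)
    conn.putrequest('PUT', path)
    conn.putheader('Content-Length', str(size))
    conn.endheaders()
    for chunk in data_iter:          # "Completed reading data from the image iterator"
        conn.send(chunk)
    try:
        return conn.getresponse()    # the step that can raise RemoteDisconnected
    finally:
        conn.close()                 # "Closing write handle for https://..."

The RemoteDisconnected traceback earlier in this log is this close step failing: the ESX host dropped the connection before sending any response to getresponse().
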
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1063.073893] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1063.074082] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1063.074255] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1063.074426] env[60788]: INFO nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Took 0.59 seconds to destroy the instance on the hypervisor. [ 1063.074652] env[60788]: DEBUG oslo.service.loopingcall [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1063.074903] env[60788]: DEBUG nova.compute.manager [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network deallocation for instance since networking was not requested. 
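
The claim cleanup that follows serializes on the same "compute_resources" lock used by instance_claim and the periodic _update_available_resource runs earlier in this log. An illustrative sketch of that discipline using the real oslo_concurrency API; the decorated function and its arguments are hypothetical.

# Illustrative sketch of the lock discipline in the surrounding entries:
# claim, abort and the periodic resource-tracker update all serialize on
# one "compute_resources" semaphore. Only the lockutils usage is real API.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def abort_claim(tracker, instance):
    # Runs only while holding "compute_resources", mirroring the
    # "acquired ... waited 0.000s" / "released ... held 0.499s" entries.
    tracker.free(instance)
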
{{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1063.077098] env[60788]: DEBUG nova.compute.claims [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1063.077277] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1063.077572] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1063.493845] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bf7ec87-a010-4549-870b-6c774b9f6358 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.502464] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-453fc8d4-dbfd-4ab8-971a-2548570b6331 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.533284] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e1592c9-c4af-46ca-802f-2be7c51f6c8a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.540483] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f24a8b87-8d8d-4045-b8b2-cfb4dadff297 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.554412] env[60788]: DEBUG nova.compute.provider_tree [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1063.562620] env[60788]: DEBUG nova.scheduler.client.report [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1063.578021] env[60788]: DEBUG 
oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.499s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1063.578021] env[60788]: ERROR nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1063.578021] env[60788]: Faults: ['InvalidArgument'] [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Traceback (most recent call last): [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self.driver.spawn(context, instance, image_meta, [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1063.578021] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self._fetch_image_if_missing(context, vi) [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] image_cache(vi, tmp_image_ds_loc) [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] vm_util.copy_virtual_disk( [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] session._wait_for_task(vmdk_copy_task) [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] return self.wait_for_task(task_ref) [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1063.578368] env[60788]: ERROR nova.compute.manager 
[instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] return evt.wait() [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] result = hub.switch() [ 1063.578368] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] return self.greenlet.switch() [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] self.f(*self.args, **self.kw) [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] raise exceptions.translate_fault(task_info.error) [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Faults: ['InvalidArgument'] [ 1063.578706] env[60788]: ERROR nova.compute.manager [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] [ 1063.578706] env[60788]: DEBUG nova.compute.utils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1063.579685] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Build of instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 was re-scheduled: A specified parameter was not correct: fileType [ 1063.579685] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1063.580071] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1063.580330] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1063.580538] env[60788]: DEBUG oslo_concurrency.lockutils [None 
[ 1063.580538] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquired lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1063.580713] env[60788]: DEBUG nova.network.neutron [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1063.605513] env[60788]: DEBUG nova.network.neutron [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1063.681988] env[60788]: DEBUG nova.network.neutron [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1063.691080] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Releasing lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1063.691307] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1063.691490] env[60788]: DEBUG nova.compute.manager [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Skipping network deallocation for instance since networking was not requested. {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 1063.783258] env[60788]: INFO nova.scheduler.client.report [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Deleted allocations for instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3
[ 1063.802435] env[60788]: DEBUG oslo_concurrency.lockutils [None req-86dc9d7c-582e-4f7c-9b4d-7ca12368d5f1 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 526.051s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1063.803429] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 327.113s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1063.803649] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1063.803852] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1063.804036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1063.805901] env[60788]: INFO nova.compute.manager [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Terminating instance
[ 1063.807359] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquiring lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
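Worth pausing on the lock timings just above: the failed build held the per-instance lock for 526.051s, so the terminate request queued behind it for 327.113s before it could proceed. The waited/held figures come from timestamps taken around acquisition and release; a small imitation of that bookkeeping with oslo.concurrency (the timing wrapper is hand-rolled for this sketch, the real one is lockutils' internal `inner` decorator):

    import time
    from contextlib import contextmanager
    from oslo_concurrency import lockutils

    @contextmanager
    def timed_lock(name, by):
        want = time.monotonic()
        with lockutils.lock(name):            # in-process threading lock
            got = time.monotonic()
            print(f'Lock "{name}" acquired by "{by}" :: '
                  f'waited {got - want:.3f}s')
            try:
                yield
            finally:
                print(f'Lock "{name}" released by "{by}" :: '
                      f'held {time.monotonic() - got:.3f}s')

    with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
        time.sleep(0.1)   # claim bookkeeping would happen here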
[ 1063.807541] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Acquired lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1063.807718] env[60788]: DEBUG nova.network.neutron [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1063.816680] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1063.844044] env[60788]: DEBUG nova.network.neutron [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1063.871906] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1063.872217] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1063.874053] env[60788]: INFO nova.compute.claims [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1063.908176] env[60788]: DEBUG nova.network.neutron [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1063.922685] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Releasing lock "refresh_cache-16c2dc56-0095-437a-942f-fcfd49c3e8f3" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1063.922806] env[60788]: DEBUG nova.compute.manager [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1063.922930] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1063.924240] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e3e2ab84-c60b-446e-96fc-d8526aa01a45 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1063.933312] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09cbbf50-d36b-49fe-a33b-e7c9fa7294a6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1063.964585] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 16c2dc56-0095-437a-942f-fcfd49c3e8f3 could not be found.
[ 1063.964797] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1063.965065] env[60788]: INFO nova.compute.manager [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1063.965241] env[60788]: DEBUG oslo.service.loopingcall [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1063.967703] env[60788]: DEBUG nova.compute.manager [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1063.967801] env[60788]: DEBUG nova.network.neutron [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1063.999059] env[60788]: DEBUG nova.network.neutron [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
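Note how deliberately idempotent the destroy path above is: the failed build never produced a VM, so the SearchIndex.FindAllByUuid lookup finds nothing, the resulting InstanceNotFound is downgraded to a WARNING, and teardown falls through to network deallocation anyway. Schematically (illustrative names, not Nova's exact signatures):

    class InstanceNotFound(Exception):
        pass

    def find_vm_by_uuid(backend, uuid):
        """Stand-in for the SearchIndex.FindAllByUuid lookup."""
        try:
            return backend[uuid]
        except KeyError:
            raise InstanceNotFound(f"Instance {uuid} could not be found.")

    def destroy(backend, uuid):
        try:
            find_vm_by_uuid(backend, uuid)
            backend.pop(uuid)              # unregister/delete the VM
        except InstanceNotFound as e:
            # Already gone (e.g. the build failed before a VM existed):
            # log it and fall through so network/allocation cleanup runs.
            print(f"WARNING: instance does not exist on backend: {e}")
        print("Instance destroyed")        # cleanup continues either way

    destroy({}, "16c2dc56-0095-437a-942f-fcfd49c3e8f3")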
[ 1064.008342] env[60788]: DEBUG nova.network.neutron [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1064.020055] env[60788]: INFO nova.compute.manager [-] [instance: 16c2dc56-0095-437a-942f-fcfd49c3e8f3] Took 0.05 seconds to deallocate network for instance.
[ 1064.119772] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5c8a0813-c08b-4247-8445-74113660a749 tempest-ServerDiagnosticsV248Test-121523375 tempest-ServerDiagnosticsV248Test-121523375-project-member] Lock "16c2dc56-0095-437a-942f-fcfd49c3e8f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.316s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1064.274258] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a942a8af-cd37-4783-831d-d2a9bce7196c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1064.281721] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d84666a-100c-40c4-ab8a-9cecaad2599d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1064.311212] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0392be9a-eaea-4e82-82eb-197b5751a931 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1064.318471] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7f0db0e-4c98-43c3-b90e-90ec2557f044 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1064.332460] env[60788]: DEBUG nova.compute.provider_tree [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1064.341488] env[60788]: DEBUG nova.scheduler.client.report [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
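The inventory dict in that report line is what Placement prices the host at: per resource class, schedulable capacity is (total - reserved) * allocation_ratio, with min_unit/max_unit/step_size bounding any single allocation. A back-of-envelope check of this provider's numbers (not Placement's code, just the arithmetic):

    # Effective capacity per the inventory reported for provider 75623588-...
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0,
                      "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0,
                      "max_unit": 65530},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0,
                      "max_unit": 176},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g} schedulable "
              f"(single allocation capped at {inv['max_unit']})")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400 -- so the m1.nano claim
    # below (1 VCPU / 128 MB / 1 GB) fits comfortably.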
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.482s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1064.354473] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1064.387099] env[60788]: DEBUG nova.compute.utils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1064.388736] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1064.388911] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1064.396989] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1064.442911] env[60788]: DEBUG nova.policy [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9da3b76a33a84170b90143cce29541c8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a4d45893a2e94234b47587f527d97a83', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1064.462495] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1064.488049] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1064.488293] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1064.488450] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1064.488645] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1064.488819] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1064.488965] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1064.489147] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1064.489309] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af 
tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1064.489473] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1064.489636] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1064.489806] env[60788]: DEBUG nova.virt.hardware [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1064.490654] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba3c431-e02d-4a0a-8204-be6a0b012491 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1064.499093] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d1dd297-fd3a-4238-83fc-e7c571a1b775 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1064.908206] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Successfully created port: 64493197-bde5-4229-bfc5-d537113b4b3b {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1065.887642] env[60788]: DEBUG nova.compute.manager [req-9d0310ff-1bfd-4156-b892-995172fd1837 req-e79197e4-dd51-4920-b96b-bd7daa6dba52 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Received event network-vif-plugged-64493197-bde5-4229-bfc5-d537113b4b3b {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1065.887889] env[60788]: DEBUG oslo_concurrency.lockutils [req-9d0310ff-1bfd-4156-b892-995172fd1837 req-e79197e4-dd51-4920-b96b-bd7daa6dba52 service nova] Acquiring lock "e5084b03-325e-40db-9ffc-0467d53adf38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1065.888101] env[60788]: DEBUG oslo_concurrency.lockutils [req-9d0310ff-1bfd-4156-b892-995172fd1837 req-e79197e4-dd51-4920-b96b-bd7daa6dba52 service nova] Lock "e5084b03-325e-40db-9ffc-0467d53adf38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1065.888269] env[60788]: DEBUG 
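The nova.virt.hardware run above is Nova enumerating the valid (sockets, cores, threads) factorizations of the flavor's vCPU count under the flavor/image limits; with one vCPU and the default 65536 limits the only candidate is 1:1:1. A compact re-derivation of the enumeration (logic condensed for illustration, not copied from nova/virt/hardware.py):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) triples whose product == vcpus."""
        for s in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % s:
                continue
            for c in range(1, min(vcpus // s, max_cores) + 1):
                if (vcpus // s) % c:
                    continue
                t = vcpus // (s * c)
                if t <= max_threads:
                    yield (s, c, t)

    print(list(possible_topologies(1)))   # [(1, 1, 1)] -- matches the log
    print(list(possible_topologies(4)))   # (1,1,4), (1,2,2), (2,1,2), ...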
[ 1065.888269] env[60788]: DEBUG oslo_concurrency.lockutils [req-9d0310ff-1bfd-4156-b892-995172fd1837 req-e79197e4-dd51-4920-b96b-bd7daa6dba52 service nova] Lock "e5084b03-325e-40db-9ffc-0467d53adf38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1065.888482] env[60788]: DEBUG nova.compute.manager [req-9d0310ff-1bfd-4156-b892-995172fd1837 req-e79197e4-dd51-4920-b96b-bd7daa6dba52 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] No waiting events found dispatching network-vif-plugged-64493197-bde5-4229-bfc5-d537113b4b3b {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1065.888679] env[60788]: WARNING nova.compute.manager [req-9d0310ff-1bfd-4156-b892-995172fd1837 req-e79197e4-dd51-4920-b96b-bd7daa6dba52 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Received unexpected event network-vif-plugged-64493197-bde5-4229-bfc5-d537113b4b3b for instance with vm_state building and task_state spawning.
[ 1065.968535] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Successfully updated port: 64493197-bde5-4229-bfc5-d537113b4b3b {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1065.979179] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "refresh_cache-e5084b03-325e-40db-9ffc-0467d53adf38" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1065.979333] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquired lock "refresh_cache-e5084b03-325e-40db-9ffc-0467d53adf38" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1065.979482] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1066.047058] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1066.231895] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Updating instance_info_cache with network_info: [{"id": "64493197-bde5-4229-bfc5-d537113b4b3b", "address": "fa:16:3e:5c:53:08", "network": {"id": "dcbfe7fa-d4e5-4897-95ec-c4af61d8c745", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-249186245-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a4d45893a2e94234b47587f527d97a83", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "163e60bd-32d6-41c5-95e6-2eb10c5c9245", "external-id": "nsx-vlan-transportzone-716", "segmentation_id": 716, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64493197-bd", "ovs_interfaceid": "64493197-bde5-4229-bfc5-d537113b4b3b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1066.246797] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Releasing lock "refresh_cache-e5084b03-325e-40db-9ffc-0467d53adf38" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
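The instance_info_cache payload above is the serialized VIF model: everything the driver needs later (MAC, fixed IP, the NSX logical-switch id the port is bound to) is denormalized into one JSON blob so no Neutron round-trip is needed at attach time. Pulling the interesting fields out of an abbreviated copy of that entry:

    # Abbreviated copy of the cached VIF entry from the log above.
    vif = {
        "id": "64493197-bde5-4229-bfc5-d537113b4b3b",
        "address": "fa:16:3e:5c:53:08",
        "type": "ovs",
        "devname": "tap64493197-bd",
        "details": {"nsx-logical-switch-id":
                    "163e60bd-32d6-41c5-95e6-2eb10c5c9245"},
        "network": {
            "bridge": "br-int",
            "subnets": [{
                "cidr": "192.168.128.0/28",
                "ips": [{"address": "192.168.128.12", "type": "fixed"}],
            }],
        },
    }

    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"] if ip["type"] == "fixed"]
    print(vif["address"], fixed_ips, vif["details"]["nsx-logical-switch-id"])
    # fa:16:3e:5c:53:08 ['192.168.128.12'] 163e60bd-32d6-41c5-95e6-2eb10c5c9245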
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1066.247548] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5c:53:08', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '163e60bd-32d6-41c5-95e6-2eb10c5c9245', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '64493197-bde5-4229-bfc5-d537113b4b3b', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1066.255457] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Creating folder: Project (a4d45893a2e94234b47587f527d97a83). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1066.256016] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7ca95a2-40d1-4df1-8a04-56c6903ec5b5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.267207] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Created folder: Project (a4d45893a2e94234b47587f527d97a83) in parent group-v449747. [ 1066.267395] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Creating folder: Instances. Parent ref: group-v449812. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1066.267677] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-161db309-8710-4bf1-96f4-c181bd699645 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.276717] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Created folder: Instances in parent group-v449812. [ 1066.276942] env[60788]: DEBUG oslo.service.loopingcall [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1066.278105] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1066.278105] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b55a5c4c-18f0-4171-bfe1-7418c468aa02 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.296421] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1066.296421] env[60788]: value = "task-2205204" [ 1066.296421] env[60788]: _type = "Task" [ 1066.296421] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1066.304561] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205204, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1066.805646] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205204, 'name': CreateVM_Task, 'duration_secs': 0.295754} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1066.805823] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1066.806499] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1066.806665] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1066.806970] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1066.807235] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-08037db5-f3f3-4f3d-8bb3-790d28e13c8d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1066.811905] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Waiting for the task: (returnval){ [ 1066.811905] env[60788]: value = 
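The "[datastore2] devstack-image-cache_base/<image-id>" lock plus the SearchDatastore_Task that follows implement check-then-fetch for the image cache: the datastore path of the cached image doubles as the lock name, so concurrent spawns of the same image serialize on the probe/download instead of racing it. The idea in miniature (in-memory cache and a placeholder download callable, not the driver's real _fetch_image_if_missing):

    from oslo_concurrency import lockutils

    image_cache = {}          # datastore path -> cached image (placeholder)

    def fetch_image_if_missing(image_id, download):
        cache_key = f"[datastore2] devstack-image-cache_base/{image_id}"
        # Serialize per image: concurrent spawns of the same image block
        # here rather than repeating the SearchDatastore/download sequence.
        with lockutils.lock(cache_key):
            if cache_key not in image_cache:   # SearchDatastore_Task analogue
                image_cache[cache_key] = download(image_id)
            return image_cache[cache_key]

    disk = fetch_image_if_missing("1d9d6f6c-1335-48c8-9690-b6c8e781cb21",
                                  lambda i: f"{i}.vmdk")
    print(disk)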
"session[524dea58-2fef-0771-5b3e-5e0329fde636]529b1e34-be7e-acc3-c19b-51eebfe72e9d" [ 1066.811905] env[60788]: _type = "Task" [ 1066.811905] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1066.819227] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]529b1e34-be7e-acc3-c19b-51eebfe72e9d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1067.322800] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1067.323118] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1067.323378] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1067.945951] env[60788]: DEBUG nova.compute.manager [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Received event network-changed-64493197-bde5-4229-bfc5-d537113b4b3b {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1067.946242] env[60788]: DEBUG nova.compute.manager [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Refreshing instance network info cache due to event network-changed-64493197-bde5-4229-bfc5-d537113b4b3b. 
[ 1067.946242] env[60788]: DEBUG nova.compute.manager [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Refreshing instance network info cache due to event network-changed-64493197-bde5-4229-bfc5-d537113b4b3b. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1067.946425] env[60788]: DEBUG oslo_concurrency.lockutils [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] Acquiring lock "refresh_cache-e5084b03-325e-40db-9ffc-0467d53adf38" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1067.946556] env[60788]: DEBUG oslo_concurrency.lockutils [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] Acquired lock "refresh_cache-e5084b03-325e-40db-9ffc-0467d53adf38" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1067.946718] env[60788]: DEBUG nova.network.neutron [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Refreshing network info cache for port 64493197-bde5-4229-bfc5-d537113b4b3b {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1068.182808] env[60788]: DEBUG nova.network.neutron [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Updated VIF entry in instance network info cache for port 64493197-bde5-4229-bfc5-d537113b4b3b. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1068.183192] env[60788]: DEBUG nova.network.neutron [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Updating instance_info_cache with network_info: [{"id": "64493197-bde5-4229-bfc5-d537113b4b3b", "address": "fa:16:3e:5c:53:08", "network": {"id": "dcbfe7fa-d4e5-4897-95ec-c4af61d8c745", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-249186245-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a4d45893a2e94234b47587f527d97a83", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "163e60bd-32d6-41c5-95e6-2eb10c5c9245", "external-id": "nsx-vlan-transportzone-716", "segmentation_id": 716, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64493197-bd", "ovs_interfaceid": "64493197-bde5-4229-bfc5-d537113b4b3b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1068.193048] env[60788]: DEBUG oslo_concurrency.lockutils [req-e6ebc699-0baa-4b58-ac03-dd8d66ac2758 req-d0bc88c8-e8cc-48e4-bb35-a35f50c7cef6 service nova] Releasing lock "refresh_cache-e5084b03-325e-40db-9ffc-0467d53adf38" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1072.270279] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "e5084b03-325e-40db-9ffc-0467d53adf38" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1072.753280] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1073.749279] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1075.708638] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1075.729233] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Getting list of instances from cluster (obj){
[ 1075.729233] env[60788]: value = "domain-c8"
[ 1075.729233] env[60788]: _type = "ClusterComputeResource"
[ 1075.729233] env[60788]: } {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1075.730741] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cead85c-ab92-4f1a-ba3e-48ea18030942 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1075.749020] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Got total of 10 instances {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 1075.749196] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid aa3bf189-1b7a-40eb-a270-711920dd84a6 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.749399] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 0259d811-2677-4164-94cd-5c4f5d935f50 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.749562] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid d0480645-be38-48de-9ae5-05c4eb0bf5d3 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.749720] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 01821598-4692-440b-8128-c50e359386e2 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.749877] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid fe6168fd-528f-4acb-a44c-6d0b69cada6e {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
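From here on, req-538cdfc0 is the periodic-task runner: each ComputeManager method registered as a periodic task fires on its own interval, and _sync_power_states then fans out one per-UUID check for every instance the cluster query returned. A toy version of that dispatch loop (plain Python standing in for oslo.service's periodic_task machinery; names and intervals are illustrative):

    class PeriodicRunner:
        """Toy stand-in for oslo.service periodic_task dispatching."""
        def __init__(self):
            self.tasks = []        # [name, spacing, fn, last_run]

        def register(self, name, spacing, fn):
            self.tasks.append([name, spacing, fn, float("-inf")])

        def run_once(self, now):
            for task in self.tasks:
                name, spacing, fn, last = task
                if now - last >= spacing:
                    print(f"Running periodic task ComputeManager.{name}")
                    fn()
                    task[3] = now

    def sync_power_states():
        for uuid in ("aa3bf189-1b7a-40eb-a270-711920dd84a6",
                     "e5084b03-325e-40db-9ffc-0467d53adf38"):
            print(f"Triggering sync for uuid {uuid}")  # lock-guarded in Nova

    runner = PeriodicRunner()
    runner.register("_sync_power_states", spacing=600, fn=sync_power_states)
    runner.register("_poll_unconfirmed_resizes", spacing=300, fn=lambda: None)
    for tick in (0, 601):
        runner.run_once(now=tick)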
[ 1075.750042] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.750197] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 5c7c0b6d-d4ea-4c78-8a76-934859d6571e {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.750345] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid c206be99-2f74-4c28-a008-e6edcccf65bf {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.750494] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid a9c14682-d6d7-43a0-b489-bd3f01a5cc17 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.750640] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid e5084b03-325e-40db-9ffc-0467d53adf38 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1075.750967] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.751227] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "0259d811-2677-4164-94cd-5c4f5d935f50" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.751432] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.751627] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "01821598-4692-440b-8128-c50e359386e2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.751817] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.752014] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.752218] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.752406] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "c206be99-2f74-4c28-a008-e6edcccf65bf" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.752596] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.752832] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "e5084b03-325e-40db-9ffc-0467d53adf38" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1075.752960] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1075.753105] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}}
[ 1075.753277] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1075.753401] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances with incomplete migration {{(pid=60788) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}}
[ 1075.800989] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1075.801400] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1075.802038] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}}
[ 1075.802038] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}}
[ 1075.821289] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.821486] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.821588] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.821712] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.821834] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.821952] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.822083] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.822203] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.822332] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.822448] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1075.822593] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}}
[ 1077.753474] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1077.753796] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1077.753938] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1077.772405] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1077.772637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1077.772805] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1077.774107] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8a19bc-5e65-4e84-abc7-348b6b5aed4d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1077.782864] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2160e1d9-4b1c-4d95-9875-324e72a7ab5d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1077.797728] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f13bea5-5e45-4afc-97b0-8d088995ac45 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1077.804416] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a9225a-707d-45f3-bebd-906500583fb4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1077.835420] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181203MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1077.835585] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1077.835793] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.013621] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.013796] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.013935] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.014115] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.014302] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.014497] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.014662] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.014845] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.014982] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.015113] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1078.029643] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.041061] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance cceddbb3-f076-4b72-882f-71432f8f0a81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.052356] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2a45087b-e101-49fd-b102-abb56b8b88e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.062580] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b9ce0d5b-0ee9-4585-9265-10a96ea62752 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.072716] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3da33ce7-b346-4970-b4ab-36a74c67d3dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.082571] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 854ad83b-7e4d-4f0d-b6d9-4e9492fc3461 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.091874] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.103479] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 77e5ef91-47b8-4d27-a899-8f4a910851b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.114247] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f4a6ac93-39eb-4a36-93d3-b01150092707 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.124372] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 027da562-4bf7-436d-bd68-af586797587a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.134342] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6489648d-415a-4625-9ac6-7ee30622c8bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.146309] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 49f64e1c-063b-4483-bd68-7423b72ea4a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.156060] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0ea664eb-1978-4725-b8a5-75ce53f0d165 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.167136] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2da6479b-4b3a-4d7d-91cb-81b563b11732 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.176503] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.187785] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.197393] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3265e0a4-28f2-4484-a164-4dc5af01d6ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.213013] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1078.213013] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1078.213177] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1078.228835] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing inventories for resource provider 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1078.242549] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating ProviderTree inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1078.242737] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1078.253376] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing aggregate associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, aggregates: None {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1078.272404] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing trait associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1078.640622] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b40425f-0a7c-4572-a4bf-3cccfc1d39f9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.649306] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6b029a1a-a655-43b8-a6cd-8f57b28ad778 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.681518] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d8d1e4d-c106-4edc-a984-1b9b0b7d5547 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.689452] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-358f3fa8-03fd-4f24-b743-e64d588c7d02 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.703393] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1078.712725] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1078.727435] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1078.727712] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.892s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1078.727862] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1080.735332] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1080.754136] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1081.754515] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1081.754795] env[60788]: 
DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 1081.765374] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] There are 0 instances to clean {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 1112.917023] env[60788]: WARNING oslo_vmware.rw_handles [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1112.917023] env[60788]: ERROR oslo_vmware.rw_handles [ 1112.917637] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1112.919544] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1112.919798] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Copying Virtual Disk [datastore2] vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/dc280724-562b-4516-86a0-b95f34b54b8a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1112.920105] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f510b0eb-39ec-4436-a799-87b4a01f6e58 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1112.927626] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 1112.927626] env[60788]: value = "task-2205205" [ 1112.927626] env[60788]: _type = "Task" [ 1112.927626] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1112.935972] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': task-2205205, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1113.438365] env[60788]: DEBUG oslo_vmware.exceptions [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1113.438678] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1113.439233] env[60788]: ERROR nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1113.439233] env[60788]: Faults: ['InvalidArgument'] [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Traceback (most recent call last): [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] yield resources [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self.driver.spawn(context, instance, image_meta, [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] 
self._fetch_image_if_missing(context, vi) [ 1113.439233] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] image_cache(vi, tmp_image_ds_loc) [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] vm_util.copy_virtual_disk( [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] session._wait_for_task(vmdk_copy_task) [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] return self.wait_for_task(task_ref) [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] return evt.wait() [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] result = hub.switch() [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1113.439642] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] return self.greenlet.switch() [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self.f(*self.args, **self.kw) [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] raise exceptions.translate_fault(task_info.error) [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Faults: ['InvalidArgument'] [ 1113.440027] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] [ 1113.440027] env[60788]: INFO nova.compute.manager [None 
req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Terminating instance [ 1113.441119] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1113.441338] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1113.441584] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5c6e222a-73c5-441f-8fe2-8ceb28650f28 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.443793] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1113.443987] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1113.444752] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-615285e7-a69e-4276-a541-c2e2f5b0ce31 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.451777] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1113.452016] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b26a7220-9118-44e3-886f-cd5e6856ba29 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.454441] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1113.454616] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1113.455591] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-651926df-0040-453e-8dad-3a3bf5275cd9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.460326] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Waiting for the task: (returnval){ [ 1113.460326] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52e02634-3ec2-4460-4f35-862a58283ea4" [ 1113.460326] env[60788]: _type = "Task" [ 1113.460326] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1113.467594] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52e02634-3ec2-4460-4f35-862a58283ea4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1113.527037] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1113.527279] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1113.527471] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Deleting the datastore file [datastore2] aa3bf189-1b7a-40eb-a270-711920dd84a6 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1113.527743] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e8749084-5c82-41c5-9132-490f2ec4822d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.533880] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for the task: (returnval){ [ 1113.533880] env[60788]: value = "task-2205207" [ 1113.533880] env[60788]: _type = "Task" [ 1113.533880] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1113.541580] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': task-2205207, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1113.970431] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1113.970805] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Creating directory with path [datastore2] vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1113.971045] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c73d0ffc-f8af-4bc7-8453-9ce31989639c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.982609] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Created directory with path [datastore2] vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1113.982813] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Fetch image to [datastore2] vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1113.982994] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1113.983746] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c23bdbb-9ea8-405b-85cd-fa57c3761665 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.990375] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09782399-b758-4e23-a4f9-dc139f7cd070 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1113.999281] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a68c393-d79b-498d-8b55-af3a53796341 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.031167] env[60788]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-914a3b6d-a57e-4583-8389-6c354999d2d9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.039066] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-73b84e5c-55ea-450e-8958-fdea4d6fd52d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.043262] env[60788]: DEBUG oslo_vmware.api [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Task: {'id': task-2205207, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066155} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1114.043753] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1114.043935] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1114.044118] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1114.044292] env[60788]: INFO nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Took 0.60 seconds to destroy the instance on the hypervisor. 
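The CopyVirtualDisk_Task and DeleteDatastoreFile_Task records above all follow the same oslo.vmware pattern: invoke a SOAP method that returns a task object, then poll it ("progress is 0%" ... "completed successfully") until it finishes or raises a translated fault. A minimal sketch of that pattern, with placeholder credentials and a hypothetical managed-object reference rather than anything taken from this log:

    from oslo_vmware import api, vim_util

    # Placeholder connection details; in Nova these come from the
    # [vmware] section of nova.conf.
    session = api.VMwareAPISession(
        'vcenter.example.com', 'user', 'password',
        api_retry_count=10, task_poll_interval=0.5)

    # Hypothetical moref, for illustration only.
    vm_ref = vim_util.get_moref('vm-123', 'VirtualMachine')

    # invoke_api issues the SOAP call (the "Invoking ..." DEBUG lines);
    # wait_for_task polls the task object and raises a translated
    # exception such as VimFaultException if the task errors, which is
    # how the InvalidArgument fault surfaces in the _cache_sparse_image
    # traceback above.
    task = session.invoke_api(session.vim, 'PowerOnVM_Task', vm_ref)
    session.wait_for_task(task)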
[ 1114.046340] env[60788]: DEBUG nova.compute.claims [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1114.046512] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.046740] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1114.063315] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1114.121426] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1114.181959] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1114.182167] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1114.440719] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb48f603-a273-4648-8f63-eff55d0d4255 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1114.448272] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0db3a4b0-afd6-4a97-91a4-acf8dc448bb7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1114.477410] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aacec1f2-e0a2-4f73-9030-3e5d86e65faa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1114.484212] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bba94c1b-68ed-4897-b9f5-dc0160daca79 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1114.497829] env[60788]: DEBUG nova.compute.provider_tree [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1114.506120] env[60788]: DEBUG nova.scheduler.client.report [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1114.518812] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.472s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1114.519338] env[60788]: ERROR nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1114.519338] env[60788]: Faults: ['InvalidArgument']
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Traceback (most recent call last):
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self.driver.spawn(context, instance, image_meta,
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self._fetch_image_if_missing(context, vi)
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] image_cache(vi, tmp_image_ds_loc)
[ 1114.519338] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] vm_util.copy_virtual_disk(
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] session._wait_for_task(vmdk_copy_task)
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] return self.wait_for_task(task_ref)
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] return evt.wait()
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] result = hub.switch()
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] return self.greenlet.switch()
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1114.519743] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] self.f(*self.args, **self.kw)
[ 1114.520148] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1114.520148] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] raise exceptions.translate_fault(task_info.error)
[ 1114.520148] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1114.520148] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Faults: ['InvalidArgument']
[ 1114.520148] env[60788]: ERROR nova.compute.manager [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6]
[ 1114.520148] env[60788]: DEBUG nova.compute.utils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1114.521361] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Build of instance aa3bf189-1b7a-40eb-a270-711920dd84a6 was re-scheduled: A specified parameter was not correct: fileType
[ 1114.521361] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1114.522030] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1114.522030] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1114.522168] env[60788]: DEBUG nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1114.522213] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1115.018713] env[60788]: DEBUG nova.network.neutron [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1115.031275] env[60788]: INFO nova.compute.manager [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Took 0.51 seconds to deallocate network for instance.
[ 1115.129453] env[60788]: INFO nova.scheduler.client.report [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Deleted allocations for instance aa3bf189-1b7a-40eb-a270-711920dd84a6
[ 1115.148443] env[60788]: DEBUG oslo_concurrency.lockutils [None req-650c655c-af39-4f74-bab7-e20c95ba6a39 tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 575.171s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1115.150235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 375.835s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1115.150235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Acquiring lock "aa3bf189-1b7a-40eb-a270-711920dd84a6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1115.150235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1115.150235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1115.152026] env[60788]: INFO nova.compute.manager [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Terminating instance
[ 1115.153837] env[60788]: DEBUG nova.compute.manager [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1115.154038] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1115.154337] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4f98b487-7763-4da5-b6fa-199892c7abae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.164369] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4f97de3-ee89-467c-868b-e7fbafd26d7c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.177105] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1115.195007] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aa3bf189-1b7a-40eb-a270-711920dd84a6 could not be found.
[ 1115.195256] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1115.195532] env[60788]: INFO nova.compute.manager [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1115.195797] env[60788]: DEBUG oslo.service.loopingcall [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1115.196025] env[60788]: DEBUG nova.compute.manager [-] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1115.196141] env[60788]: DEBUG nova.network.neutron [-] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1115.223461] env[60788]: DEBUG nova.network.neutron [-] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1115.230813] env[60788]: INFO nova.compute.manager [-] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] Took 0.03 seconds to deallocate network for instance.
[ 1115.232947] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1115.233214] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1115.234846] env[60788]: INFO nova.compute.claims [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1115.329556] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3e4ec870-dcc5-4886-800e-86623ac8736d tempest-ServersAdminTestJSON-756204499 tempest-ServersAdminTestJSON-756204499-project-member] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.180s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1115.330442] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 39.579s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1115.330640] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: aa3bf189-1b7a-40eb-a270-711920dd84a6] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1115.330812] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "aa3bf189-1b7a-40eb-a270-711920dd84a6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1115.578659] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28abe3c0-1ae8-4942-ae40-f804479e42c3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.589249] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6015dec4-5411-4b52-85a0-54264bd3594a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.621534] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb97258-4a77-4cb1-8c1c-0393838039ba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.628992] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cef9f18e-13d2-4e3e-b437-9b9cadb73444 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.642077] env[60788]: DEBUG nova.compute.provider_tree [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1115.651830] env[60788]: DEBUG nova.scheduler.client.report [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1115.665649] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.432s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1115.666093] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1115.698563] env[60788]: DEBUG nova.compute.utils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1115.699911] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1115.700094] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1115.708389] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1115.772053] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1115.798431] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1115.798749] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1115.798922] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1115.799356] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1115.799436] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1115.799555] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1115.799782] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1115.799947] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1115.800186] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1115.800298] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1115.800497] env[60788]: DEBUG nova.virt.hardware [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1115.801369] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08ff360b-3846-4211-9db0-70827a4e6694 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1115.809819] env[60788]: DEBUG nova.policy [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9608a7d578f54e3aa974e37153821d4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '936e92b1754a415b9b9d7cff62af1e2b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1115.812246] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12cdf405-b6ab-43c1-9db6-0056ff561d85 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1116.364009] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Successfully created port: 6c1e3bf6-4746-4657-9b8d-9d4626be79c7 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1117.366078] env[60788]: DEBUG nova.compute.manager [req-4de93149-9184-407e-905e-0d3c88596b03 req-c500f737-73bc-4172-93ce-6084b17ce059 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Received event network-vif-plugged-6c1e3bf6-4746-4657-9b8d-9d4626be79c7 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1117.366354] env[60788]: DEBUG oslo_concurrency.lockutils [req-4de93149-9184-407e-905e-0d3c88596b03 req-c500f737-73bc-4172-93ce-6084b17ce059 service nova] Acquiring lock "529472d7-5e71-4997-96de-64d41b9d3515-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1117.366507] env[60788]: DEBUG oslo_concurrency.lockutils [req-4de93149-9184-407e-905e-0d3c88596b03 req-c500f737-73bc-4172-93ce-6084b17ce059 service nova] Lock "529472d7-5e71-4997-96de-64d41b9d3515-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1117.366744] env[60788]: DEBUG oslo_concurrency.lockutils [req-4de93149-9184-407e-905e-0d3c88596b03 req-c500f737-73bc-4172-93ce-6084b17ce059 service nova] Lock "529472d7-5e71-4997-96de-64d41b9d3515-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1117.366839] env[60788]: DEBUG nova.compute.manager [req-4de93149-9184-407e-905e-0d3c88596b03 req-c500f737-73bc-4172-93ce-6084b17ce059 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] No waiting events found dispatching network-vif-plugged-6c1e3bf6-4746-4657-9b8d-9d4626be79c7 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1117.366981] env[60788]: WARNING nova.compute.manager [req-4de93149-9184-407e-905e-0d3c88596b03 req-c500f737-73bc-4172-93ce-6084b17ce059 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Received unexpected event network-vif-plugged-6c1e3bf6-4746-4657-9b8d-9d4626be79c7 for instance with vm_state building and task_state spawning.
[ 1117.421592] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Successfully updated port: 6c1e3bf6-4746-4657-9b8d-9d4626be79c7 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1117.435485] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1117.435648] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1117.435972] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1117.512797] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1117.777677] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Updating instance_info_cache with network_info: [{"id": "6c1e3bf6-4746-4657-9b8d-9d4626be79c7", "address": "fa:16:3e:52:6b:91", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c1e3bf6-47", "ovs_interfaceid": "6c1e3bf6-4746-4657-9b8d-9d4626be79c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1117.791929] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1117.791929] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance network_info: |[{"id": "6c1e3bf6-4746-4657-9b8d-9d4626be79c7", "address": "fa:16:3e:52:6b:91", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c1e3bf6-47", "ovs_interfaceid": "6c1e3bf6-4746-4657-9b8d-9d4626be79c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 1117.792114] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:52:6b:91', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed3ffc1d-9f86-4029-857e-6cd1d383edbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6c1e3bf6-4746-4657-9b8d-9d4626be79c7', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1117.799380] env[60788]: DEBUG oslo.service.loopingcall [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1117.799881] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1117.800162] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4a3f8283-d754-4887-94e7-b6f899b512d9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1117.820943] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1117.820943] env[60788]: value = "task-2205208"
[ 1117.820943] env[60788]: _type = "Task"
[ 1117.820943] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1117.829650] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205208, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1118.331478] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205208, 'name': CreateVM_Task, 'duration_secs': 0.285313} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1118.331579] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1118.332303] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1118.332470] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1118.332779] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1118.333040] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3a2bde8f-ae40-4c5d-a235-c2f96e6c94c7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1118.337992] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){
[ 1118.337992] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52ebb748-4560-81ca-666b-b9042c7081fc"
[ 1118.337992] env[60788]: _type = "Task"
[ 1118.337992] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1118.345558] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52ebb748-4560-81ca-666b-b9042c7081fc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1118.848487] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1118.848895] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1118.849172] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1119.414804] env[60788]: DEBUG nova.compute.manager [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Received event network-changed-6c1e3bf6-4746-4657-9b8d-9d4626be79c7 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1119.415118] env[60788]: DEBUG nova.compute.manager [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Refreshing instance network info cache due to event network-changed-6c1e3bf6-4746-4657-9b8d-9d4626be79c7. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1119.415327] env[60788]: DEBUG oslo_concurrency.lockutils [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] Acquiring lock "refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1119.415504] env[60788]: DEBUG oslo_concurrency.lockutils [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] Acquired lock "refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1119.415680] env[60788]: DEBUG nova.network.neutron [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Refreshing network info cache for port 6c1e3bf6-4746-4657-9b8d-9d4626be79c7 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1119.769271] env[60788]: DEBUG nova.network.neutron [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Updated VIF entry in instance network info cache for port 6c1e3bf6-4746-4657-9b8d-9d4626be79c7. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1119.769631] env[60788]: DEBUG nova.network.neutron [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Updating instance_info_cache with network_info: [{"id": "6c1e3bf6-4746-4657-9b8d-9d4626be79c7", "address": "fa:16:3e:52:6b:91", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c1e3bf6-47", "ovs_interfaceid": "6c1e3bf6-4746-4657-9b8d-9d4626be79c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1119.780301] env[60788]: DEBUG oslo_concurrency.lockutils [req-59aa3523-2673-495f-b0e2-793c112a71ec req-f3f74de4-e73c-41a6-bf0b-20d5ad1c3b55 service nova] Releasing lock "refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1120.128045] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "d63f9834-818b-4087-851c-d7394d20b89d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1120.128045] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "d63f9834-818b-4087-851c-d7394d20b89d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1129.427319] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "529472d7-5e71-4997-96de-64d41b9d3515" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1132.764586] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1135.391358] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "688ff077-9505-48f5-9117-0a7f115f254c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1135.391358] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "688ff077-9505-48f5-9117-0a7f115f254c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1136.754159] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1136.754472] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}}
[ 1137.749224] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1137.752881] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1137.753059] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}}
[ 1137.753185] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}}
[ 1137.777527] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.778142] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.778828] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.778828] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.778828] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.778828] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.779087] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.779087] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.779171] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.779262] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1137.779392] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}}
[ 1137.779898] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1137.790926] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1137.791209] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1137.791330] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1137.791528] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1137.792567] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5f9962e-3d28-457a-b01f-c9b8462536e6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1137.801822] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ed2946-9d91-44a1-9b81-9d7a7cf1db4c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1137.816387] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd249446-ca13-4dca-8f94-fa18e8e18ee1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1137.822695] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96d883f1-3bf7-448d-bdb7-44601f2a6e18 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1137.853325] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181260MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1137.853476] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1137.853669] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1137.927853] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0259d811-2677-4164-94cd-5c4f5d935f50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928032] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928168] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928323] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928453] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928575] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928694] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928810] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.928924] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.929050] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1137.941069] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance b9ce0d5b-0ee9-4585-9265-10a96ea62752 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1137.951841] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3da33ce7-b346-4970-b4ab-36a74c67d3dd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1137.961835] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 854ad83b-7e4d-4f0d-b6d9-4e9492fc3461 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1137.971804] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1137.981369] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 77e5ef91-47b8-4d27-a899-8f4a910851b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1137.992298] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f4a6ac93-39eb-4a36-93d3-b01150092707 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.002545] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 027da562-4bf7-436d-bd68-af586797587a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.011968] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6489648d-415a-4625-9ac6-7ee30622c8bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.021161] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 49f64e1c-063b-4483-bd68-7423b72ea4a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.031108] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 0ea664eb-1978-4725-b8a5-75ce53f0d165 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.040936] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2da6479b-4b3a-4d7d-91cb-81b563b11732 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.050373] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.060373] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.070985] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3265e0a4-28f2-4484-a164-4dc5af01d6ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.079868] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.088870] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1138.097661] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1138.097947] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1138.098110] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1138.373752] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0906d2a2-418b-490f-b5d4-2245ae3b2b42 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.381617] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e158390-6629-48eb-9705-d33d3ed303b5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.410959] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db9d6c9b-9520-4f85-bcf5-b54c92559863 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.417413] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da955827-dd87-43cc-b3e6-9f7c43414462 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1138.430142] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1138.438475] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1138.454356] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1138.454547] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
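
The inventory record above is what the resource tracker reports to placement for provider 75623588-d529-4955-b0d7-8c3260d605e7. A useful way to read it: the effective capacity the scheduler sees per resource class is (total - reserved) * allocation_ratio. A minimal sketch of that arithmetic, using the values from the log entry (not Nova's actual code):

```python
# Sketch only: reproduces the capacity arithmetic implied by the
# inventory fields logged above; field names mirror the log entry.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

def effective_capacity(inv):
    # Placement-style capacity: (total - reserved) * allocation_ratio
    return {rc: (f['total'] - f['reserved']) * f['allocation_ratio']
            for rc, f in inv.items()}

print(effective_capacity(inventory))
# -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
```

With used_vcpus=10 against 192 effective VCPUs, the host is far from its CPU limit even though total_vcpus is 48; the 4.0 allocation_ratio accounts for the difference.
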
[ 1139.429279] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1139.429648] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1140.753830] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1141.754575] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1160.770272] env[60788]: WARNING oslo_vmware.rw_handles [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1160.770272] env[60788]: ERROR oslo_vmware.rw_handles [ 1160.771150] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1160.773385] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Caching image {{(pid=60788) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1160.773644] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Copying Virtual Disk [datastore2] vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/1f17c1b8-b062-4ded-b76d-3aea40c9fb74/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1160.773931] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0703742b-cb03-4ee7-bc68-1d5d2747ddce {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1160.782550] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Waiting for the task: (returnval){ [ 1160.782550] env[60788]: value = "task-2205209" [ 1160.782550] env[60788]: _type = "Task" [ 1160.782550] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1160.791076] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Task: {'id': task-2205209, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1161.293489] env[60788]: DEBUG oslo_vmware.exceptions [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1161.293753] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1161.294307] env[60788]: ERROR nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1161.294307] env[60788]: Faults: ['InvalidArgument'] [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Traceback (most recent call last): [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] yield resources [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self.driver.spawn(context, instance, image_meta, [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self._fetch_image_if_missing(context, vi) [ 1161.294307] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] image_cache(vi, tmp_image_ds_loc) [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] vm_util.copy_virtual_disk( [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] session._wait_for_task(vmdk_copy_task) [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] return self.wait_for_task(task_ref) [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] return evt.wait() [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] result = hub.switch() [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1161.294695] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] return self.greenlet.switch() [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self.f(*self.args, **self.kw) [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] raise exceptions.translate_fault(task_info.error) [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Faults: ['InvalidArgument'] [ 1161.295080] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] [ 1161.295080] env[60788]: INFO nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Terminating instance [ 1161.296305] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1161.296516] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1161.296758] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-8b774495-b5f7-436c-8789-863a0461f34b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.300044] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1161.300694] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1161.300960] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5813df5-0b15-4e4e-b780-08c2a4d1334e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.308158] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1161.308417] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-30c505ef-cd2c-40c9-9732-b62ac3ddb60f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.310810] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1161.310986] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1161.312065] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c2319f5a-c073-49eb-8cf6-89aa2fc3877d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.316731] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1161.316731] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5251d8d4-d259-f77c-5025-5b6058918ca3" [ 1161.316731] env[60788]: _type = "Task" [ 1161.316731] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1161.324220] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5251d8d4-d259-f77c-5025-5b6058918ca3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1161.377015] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1161.377341] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1161.377513] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Deleting the datastore file [datastore2] 0259d811-2677-4164-94cd-5c4f5d935f50 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1161.377797] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dd3c827b-1cfd-4ab0-a650-7937191c1597 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.384428] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Waiting for the task: (returnval){ [ 1161.384428] env[60788]: value = "task-2205211" [ 1161.384428] env[60788]: _type = "Task" [ 1161.384428] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1161.392379] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Task: {'id': task-2205211, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1161.828354] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1161.828632] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating directory with path [datastore2] vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1161.828893] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e95a00fb-2d1b-409e-af87-6dfae933b2f3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.840292] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created directory with path [datastore2] vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1161.840496] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Fetch image to [datastore2] vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1161.840682] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1161.841461] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6684d93-12f9-4b26-85be-f2fac5e2cc78 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.848538] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efc17cd5-073e-4cfa-aec6-998de378a3e7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.858120] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fe9f00d-8515-4909-9d0d-294aa2f5bdce {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.895020] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1e096425-29c7-4e3f-9059-a856ff8e27c1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.902331] env[60788]: DEBUG oslo_vmware.api [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Task: {'id': task-2205211, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075479} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1161.903962] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1161.904161] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1161.904342] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1161.904522] env[60788]: INFO nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Took 0.60 seconds to destroy the instance on the hypervisor. 
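
The Task/wait pattern repeats throughout this log: Nova invokes a vCenter method that returns a task handle (CopyVirtualDisk_Task, DeleteDatastoreFile_Task), then oslo.vmware polls it ("progress is 0%." ... "completed successfully", with a duration_secs field) and raises a translated fault on error, which is how the InvalidArgument traceback above surfaces. A simplified, self-contained sketch of that polling loop (hypothetical names; not oslo.vmware's implementation):

```python
import time

class TaskFault(Exception):
    """Stands in for oslo_vmware.exceptions.VimFaultException (sketch)."""

def wait_for_task(poll, interval=0.5, timeout=300):
    # poll() is assumed to return a dict like:
    #   {'id': 'task-2205211', 'state': 'running'|'success'|'error',
    #    'progress': 0..100, 'error': '...'}
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll()
        if info['state'] == 'success':
            return info                      # terminal; includes e.g. duration
        if info['state'] == 'error':
            # mirrors _poll_task raising exceptions.translate_fault(...)
            raise TaskFault(info.get('error', 'unknown fault'))
        print(f"Task: {info['id']} progress is {info.get('progress', 0)}%.")
        time.sleep(interval)
    raise TaskFault('timed out waiting for task')
```

The blocking wait is what the "Waiting for the task: (returnval){ value = ... _type = "Task" } to complete." entries correspond to; each subsequent _poll_task line is one iteration of such a loop.
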
[ 1161.906500] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b6d6de8a-12f9-4e48-b71a-5f1bf0530038 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1161.912138] env[60788]: DEBUG nova.compute.claims [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1161.912138] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1161.912138] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1161.929220] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1162.113877] env[60788]: DEBUG oslo_vmware.rw_handles [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1162.188148] env[60788]: DEBUG oslo_vmware.rw_handles [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1162.188148] env[60788]: DEBUG oslo_vmware.rw_handles [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1162.371832] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81493732-fee4-4308-8a85-8f7254ea2bef {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.380574] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-993aab16-7498-43c1-b207-c540c28bfcb4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.419723] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e133bf91-f1a5-4b2d-ba5d-a90c6329277e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.428140] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce181af7-9887-4889-9693-131d42ace136 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1162.441784] env[60788]: DEBUG nova.compute.provider_tree [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1162.450942] env[60788]: DEBUG nova.scheduler.client.report [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1162.468766] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.559s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1162.469329] env[60788]: ERROR nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1162.469329] env[60788]: Faults: ['InvalidArgument'] [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Traceback (most recent call last): [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/compute/manager.py", line 2616, 
in _build_and_run_instance [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self.driver.spawn(context, instance, image_meta, [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self._fetch_image_if_missing(context, vi) [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] image_cache(vi, tmp_image_ds_loc) [ 1162.469329] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] vm_util.copy_virtual_disk( [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] session._wait_for_task(vmdk_copy_task) [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] return self.wait_for_task(task_ref) [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] return evt.wait() [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] result = hub.switch() [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] return self.greenlet.switch() [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1162.469684] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] self.f(*self.args, **self.kw) [ 1162.470044] env[60788]: ERROR nova.compute.manager [instance: 
0259d811-2677-4164-94cd-5c4f5d935f50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1162.470044] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] raise exceptions.translate_fault(task_info.error) [ 1162.470044] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1162.470044] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Faults: ['InvalidArgument'] [ 1162.470044] env[60788]: ERROR nova.compute.manager [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] [ 1162.470044] env[60788]: DEBUG nova.compute.utils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1162.471998] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Build of instance 0259d811-2677-4164-94cd-5c4f5d935f50 was re-scheduled: A specified parameter was not correct: fileType [ 1162.471998] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1162.472386] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1162.472595] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1162.472798] env[60788]: DEBUG nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1162.472992] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1163.529678] env[60788]: DEBUG nova.network.neutron [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1163.541982] env[60788]: INFO nova.compute.manager [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Took 1.07 seconds to deallocate network for instance. [ 1163.674341] env[60788]: INFO nova.scheduler.client.report [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Deleted allocations for instance 0259d811-2677-4164-94cd-5c4f5d935f50 [ 1163.700071] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4d5e6527-50ad-4d9e-b54d-cc3d31728343 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "0259d811-2677-4164-94cd-5c4f5d935f50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 621.368s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.701184] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "0259d811-2677-4164-94cd-5c4f5d935f50" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 422.521s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1163.701383] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Acquiring lock "0259d811-2677-4164-94cd-5c4f5d935f50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1163.701533] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock 
"0259d811-2677-4164-94cd-5c4f5d935f50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1163.701753] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "0259d811-2677-4164-94cd-5c4f5d935f50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.704427] env[60788]: INFO nova.compute.manager [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Terminating instance [ 1163.706208] env[60788]: DEBUG nova.compute.manager [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1163.707064] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1163.707064] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8957c1e2-d564-4967-9f5a-f1e7d9b6f4c6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.717357] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f749c52f-df7c-4d2d-ba00-5f42ba05708b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1163.729375] env[60788]: DEBUG nova.compute.manager [None req-756087a5-abc9-49f2-bbb7-12d03d5ae899 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: cceddbb3-f076-4b72-882f-71432f8f0a81] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1163.754723] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0259d811-2677-4164-94cd-5c4f5d935f50 could not be found. 
[ 1163.755029] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1163.755230] env[60788]: INFO nova.compute.manager [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1163.755560] env[60788]: DEBUG oslo.service.loopingcall [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1163.755893] env[60788]: DEBUG nova.compute.manager [-] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1163.755998] env[60788]: DEBUG nova.network.neutron [-] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1163.762809] env[60788]: DEBUG nova.compute.manager [None req-756087a5-abc9-49f2-bbb7-12d03d5ae899 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: cceddbb3-f076-4b72-882f-71432f8f0a81] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1163.788467] env[60788]: DEBUG oslo_concurrency.lockutils [None req-756087a5-abc9-49f2-bbb7-12d03d5ae899 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "cceddbb3-f076-4b72-882f-71432f8f0a81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.585s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.795633] env[60788]: DEBUG nova.network.neutron [-] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1163.804341] env[60788]: DEBUG nova.compute.manager [None req-6caf0247-3988-4f45-a998-126613f0c5d8 tempest-ServerRescueTestJSON-23639953 tempest-ServerRescueTestJSON-23639953-project-member] [instance: 2a45087b-e101-49fd-b102-abb56b8b88e0] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1163.807358] env[60788]: INFO nova.compute.manager [-] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] Took 0.05 seconds to deallocate network for instance. [ 1163.831293] env[60788]: DEBUG nova.compute.manager [None req-6caf0247-3988-4f45-a998-126613f0c5d8 tempest-ServerRescueTestJSON-23639953 tempest-ServerRescueTestJSON-23639953-project-member] [instance: 2a45087b-e101-49fd-b102-abb56b8b88e0] Instance disappeared before build. 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1163.867732] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6caf0247-3988-4f45-a998-126613f0c5d8 tempest-ServerRescueTestJSON-23639953 tempest-ServerRescueTestJSON-23639953-project-member] Lock "2a45087b-e101-49fd-b102-abb56b8b88e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.484s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.879276] env[60788]: DEBUG nova.compute.manager [None req-008ae44a-6601-4421-84df-c8d1a9b92f59 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] [instance: b9ce0d5b-0ee9-4585-9265-10a96ea62752] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1163.906592] env[60788]: DEBUG nova.compute.manager [None req-008ae44a-6601-4421-84df-c8d1a9b92f59 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] [instance: b9ce0d5b-0ee9-4585-9265-10a96ea62752] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1163.944506] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfa80c2b-a178-4024-9acc-1612cd689653 tempest-AttachInterfacesUnderV243Test-1719757548 tempest-AttachInterfacesUnderV243Test-1719757548-project-member] Lock "0259d811-2677-4164-94cd-5c4f5d935f50" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.243s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.945680] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "0259d811-2677-4164-94cd-5c4f5d935f50" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 88.194s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1163.945964] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 0259d811-2677-4164-94cd-5c4f5d935f50] During sync_power_state the instance has a pending task (deleting). Skip. 
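
The _sync_power_states entry above shows the periodic task serializing on a per-instance lock and then bailing out because the instance has a pending task (deleting), so it never fights an in-flight operation. A minimal sketch of that skip rule, with hypothetical data structures:

```python
import threading

locks = {}  # per-instance locks, keyed by uuid (sketch)

def query_driver_power_state_and_sync(instance, driver_power_state):
    lock = locks.setdefault(instance['uuid'], threading.Lock())
    with lock:  # mirrors the Lock "<uuid>" acquired/released pair in the log
        if instance.get('task_state') is not None:
            # e.g. task_state == 'deleting' while termination is running
            print(f"During sync_power_state the instance has a pending "
                  f"task ({instance['task_state']}). Skip.")
            return
        if instance['power_state'] != driver_power_state:
            instance['power_state'] = driver_power_state
```

The lock waited 88.194s here because the terminate path held the instance lock for the whole build/teardown; by the time the sync ran, the delete was still pending, so skipping was the safe choice.
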
[ 1163.946197] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "0259d811-2677-4164-94cd-5c4f5d935f50" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.948893] env[60788]: DEBUG oslo_concurrency.lockutils [None req-008ae44a-6601-4421-84df-c8d1a9b92f59 tempest-AttachVolumeNegativeTest-1916379500 tempest-AttachVolumeNegativeTest-1916379500-project-member] Lock "b9ce0d5b-0ee9-4585-9265-10a96ea62752" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.721s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.958153] env[60788]: DEBUG nova.compute.manager [None req-c07ad454-ec60-4ee6-9a01-5a2004ff26b2 tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] [instance: 3da33ce7-b346-4970-b4ab-36a74c67d3dd] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1163.989232] env[60788]: DEBUG nova.compute.manager [None req-c07ad454-ec60-4ee6-9a01-5a2004ff26b2 tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] [instance: 3da33ce7-b346-4970-b4ab-36a74c67d3dd] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1164.012413] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c07ad454-ec60-4ee6-9a01-5a2004ff26b2 tempest-AttachVolumeTestJSON-946912437 tempest-AttachVolumeTestJSON-946912437-project-member] Lock "3da33ce7-b346-4970-b4ab-36a74c67d3dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.027s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1164.022195] env[60788]: DEBUG nova.compute.manager [None req-6465b186-24ca-4e90-b6e9-d3908cdfd1e1 tempest-ServersTestBootFromVolume-1578403687 tempest-ServersTestBootFromVolume-1578403687-project-member] [instance: 854ad83b-7e4d-4f0d-b6d9-4e9492fc3461] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1164.047162] env[60788]: DEBUG nova.compute.manager [None req-6465b186-24ca-4e90-b6e9-d3908cdfd1e1 tempest-ServersTestBootFromVolume-1578403687 tempest-ServersTestBootFromVolume-1578403687-project-member] [instance: 854ad83b-7e4d-4f0d-b6d9-4e9492fc3461] Instance disappeared before build. 
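Each of these locks is named by the target instance's UUID, so a build, a terminate and a power-state sync on the same instance serialize against one another while different instances proceed in parallel; that is how the build lock for "b9ce0d5b-..." could be held for 213.721s without stalling the other tempest servers. A toy version of name-keyed locking (plain threading, not oslo.concurrency):

```python
import threading
from collections import defaultdict

_instance_locks = defaultdict(threading.Lock)  # one lock per UUID

def locked_instance_op(uuid, op):
    # Operations on the same instance serialize; different UUIDs map
    # to different locks and can run concurrently.
    with _instance_locks[uuid]:
        return op()

locked_instance_op("b9ce0d5b-0ee9-4585-9265-10a96ea62752",
                   lambda: print("build_and_run_instance"))
locked_instance_op("b9ce0d5b-0ee9-4585-9265-10a96ea62752",
                   lambda: print("sync_power_state"))
```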
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1164.072993] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6465b186-24ca-4e90-b6e9-d3908cdfd1e1 tempest-ServersTestBootFromVolume-1578403687 tempest-ServersTestBootFromVolume-1578403687-project-member] Lock "854ad83b-7e4d-4f0d-b6d9-4e9492fc3461" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.597s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1164.090084] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1164.147460] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1164.148208] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1164.149736] env[60788]: INFO nova.compute.claims [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1164.488041] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59466082-72ab-4d23-951d-33ca94d5bc71 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1164.495875] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d05f7e70-6f3e-41bc-8b03-e51365aa0c0c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1164.527109] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17bb5188-0b97-4785-8955-9b715cf7e16e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1164.534244] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-313e7a9e-a5d3-4330-a126-6857de57d08d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1164.547071] env[60788]: DEBUG nova.compute.provider_tree [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1164.557528] env[60788]: DEBUG nova.scheduler.client.report [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1164.575037] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.427s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1164.575420] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1164.631434] env[60788]: DEBUG nova.compute.utils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1164.633009] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1164.633009] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1164.644294] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1164.723027] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Start spawning the instance on the hypervisor. 
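The inventory reported for provider 75623588-d529-4955-b0d7-8c3260d605e7 is what the claim a few lines up was checked against: per resource class, placement treats (total - reserved) * allocation_ratio as schedulable capacity, with max_unit capping any single allocation. Recomputing that from the exact values in the log (the helper function is hypothetical; the formula is placement's standard one):

```python
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'max_unit': 16,
             'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530,
                  'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'max_unit': 176,
                'allocation_ratio': 1.0},
}

def capacity(inv):
    # Schedulable capacity per resource class.
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(capacity(inventory))
# {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
```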
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1164.730384] env[60788]: DEBUG nova.policy [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3fd0f3818eda48409da3a8977e2d963b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2313b232c99a4a16a40e01fec91c13f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1164.760030] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1164.760530] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1164.760826] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1164.762018] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1164.762018] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1164.762018] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1164.762018] env[60788]: DEBUG 
nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1164.762018] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1164.762293] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1164.766018] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1164.766018] env[60788]: DEBUG nova.virt.hardware [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1164.766018] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc947b9c-68b2-4e6a-b38a-1b519b25ec4f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1164.773111] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e41b84a9-b9e7-427f-8315-c2f384d41d7a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1165.355181] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Successfully created port: 8e65a732-c16c-4504-aa57-5e715d6d414f {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1166.069293] env[60788]: DEBUG nova.compute.manager [req-713479bb-d588-4463-8c18-36a70814ad19 req-79ac094a-cda1-43c6-bcda-0aa8ace12aa2 service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Received event network-vif-plugged-8e65a732-c16c-4504-aa57-5e715d6d414f {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1166.069561] env[60788]: DEBUG oslo_concurrency.lockutils [req-713479bb-d588-4463-8c18-36a70814ad19 req-79ac094a-cda1-43c6-bcda-0aa8ace12aa2 service nova] Acquiring lock "28605b2e-9795-47a0-821c-5cf8da077d37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1166.069749] env[60788]: DEBUG oslo_concurrency.lockutils [req-713479bb-d588-4463-8c18-36a70814ad19 
req-79ac094a-cda1-43c6-bcda-0aa8ace12aa2 service nova] Lock "28605b2e-9795-47a0-821c-5cf8da077d37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1166.069917] env[60788]: DEBUG oslo_concurrency.lockutils [req-713479bb-d588-4463-8c18-36a70814ad19 req-79ac094a-cda1-43c6-bcda-0aa8ace12aa2 service nova] Lock "28605b2e-9795-47a0-821c-5cf8da077d37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1166.070362] env[60788]: DEBUG nova.compute.manager [req-713479bb-d588-4463-8c18-36a70814ad19 req-79ac094a-cda1-43c6-bcda-0aa8ace12aa2 service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] No waiting events found dispatching network-vif-plugged-8e65a732-c16c-4504-aa57-5e715d6d414f {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1166.070612] env[60788]: WARNING nova.compute.manager [req-713479bb-d588-4463-8c18-36a70814ad19 req-79ac094a-cda1-43c6-bcda-0aa8ace12aa2 service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Received unexpected event network-vif-plugged-8e65a732-c16c-4504-aa57-5e715d6d414f for instance with vm_state building and task_state spawning. [ 1166.209748] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Successfully updated port: 8e65a732-c16c-4504-aa57-5e715d6d414f {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1166.221317] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "refresh_cache-28605b2e-9795-47a0-821c-5cf8da077d37" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1166.221466] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "refresh_cache-28605b2e-9795-47a0-821c-5cf8da077d37" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1166.221617] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1166.281635] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance cache missing network info. 
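The "<uuid>-events" lock and the "No waiting events found" / "unexpected event" pair above reflect Nova's external-event rendezvous: a spawning thread registers the events it expects, and the handler fed by Neutron pops and signals them, warning when an event arrives that nobody registered for. A toy rendezvous built on threading.Event (not Nova's implementation):

```python
import threading

class InstanceEvents:
    """Toy version of the register/pop dance behind the
    'No waiting events found dispatching ...' lines; not Nova's code."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the "<uuid>-events" lock
        self._waiters = {}              # (instance, event) -> Event

    def prepare(self, instance, event):
        with self._lock:
            ev = threading.Event()
            self._waiters[(instance, event)] = ev
            return ev

    def pop(self, instance, event):
        with self._lock:
            waiter = self._waiters.pop((instance, event), None)
        if waiter is None:
            print("unexpected event %s for %s" % (event, instance))
        else:
            waiter.set()

events = InstanceEvents()
# Nothing registered yet, so this mirrors the WARNING in the log:
events.pop("28605b2e", "network-vif-plugged-8e65a732")
```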
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1166.758220] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Updating instance_info_cache with network_info: [{"id": "8e65a732-c16c-4504-aa57-5e715d6d414f", "address": "fa:16:3e:4d:8b:23", "network": {"id": "31e5241c-3d65-4624-bcf6-45322ce7ebd6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1793513747-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2313b232c99a4a16a40e01fec91c13f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbdab640-5fea-4254-8bd3-f855b7eaca0d", "external-id": "nsx-vlan-transportzone-615", "segmentation_id": 615, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e65a732-c1", "ovs_interfaceid": "8e65a732-c16c-4504-aa57-5e715d6d414f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1166.777451] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "refresh_cache-28605b2e-9795-47a0-821c-5cf8da077d37" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1166.778321] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance network_info: |[{"id": "8e65a732-c16c-4504-aa57-5e715d6d414f", "address": "fa:16:3e:4d:8b:23", "network": {"id": "31e5241c-3d65-4624-bcf6-45322ce7ebd6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1793513747-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2313b232c99a4a16a40e01fec91c13f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbdab640-5fea-4254-8bd3-f855b7eaca0d", "external-id": "nsx-vlan-transportzone-615", "segmentation_id": 615, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e65a732-c1", "ovs_interfaceid": "8e65a732-c16c-4504-aa57-5e715d6d414f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}}
[ 1166.778497] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4d:8b:23', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dbdab640-5fea-4254-8bd3-f855b7eaca0d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8e65a732-c16c-4504-aa57-5e715d6d414f', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1166.788576] env[60788]: DEBUG oslo.service.loopingcall [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1166.788576] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1166.788576] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c5afe58a-a521-43ad-9dbb-4027c26662df {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1166.809167] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1166.809167] env[60788]: value = "task-2205212"
[ 1166.809167] env[60788]: _type = "Task"
[ 1166.809167] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1166.817802] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205212, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1167.319913] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205212, 'name': CreateVM_Task, 'duration_secs': 0.408714} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1167.320223] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1167.335755] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1167.335982] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1167.336314] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1167.336587] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7cd113a3-589b-4686-977d-6a689d73e888 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1167.341464] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){
[ 1167.341464] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]522ca400-643c-c087-8717-6ed08558ce16"
[ 1167.341464] env[60788]: _type = "Task"
[ 1167.341464] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1167.350216] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]522ca400-643c-c087-8717-6ed08558ce16, 'name': SearchDatastore_Task} progress is 0%.
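Both task waits above follow the same shape: invoke the vCenter method, receive a task handle (task-2205212 here, or a session-scoped identifier for SearchDatastore_Task), then poll until the task reports success or error, logging progress along the way. A bare-bones poll loop in that spirit, against a simulated task (oslo.vmware's real wait_for_task differs in detail):

```python
import time

def wait_for_task(poll, interval=0.5, timeout=60):
    """Poll `poll()` until it reports success or error.

    `poll` returns a dict like {'state': ..., 'progress': ...}; this is
    a stand-in for reading the vCenter task's info property, not the
    oslo.vmware API.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            raise RuntimeError(info.get('error', 'task failed'))
        time.sleep(interval)
    raise TimeoutError('task did not complete in %ss' % timeout)

# Simulated CreateVM_Task that succeeds on the second poll:
states = iter([{'state': 'running', 'progress': 0},
               {'state': 'success', 'progress': 100}])
print(wait_for_task(lambda: next(states), interval=0.01))
```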
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1167.852676] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1167.852942] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1167.853169] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1168.132060] env[60788]: DEBUG nova.compute.manager [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Received event network-changed-8e65a732-c16c-4504-aa57-5e715d6d414f {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1168.132301] env[60788]: DEBUG nova.compute.manager [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Refreshing instance network info cache due to event network-changed-8e65a732-c16c-4504-aa57-5e715d6d414f. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1168.132507] env[60788]: DEBUG oslo_concurrency.lockutils [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] Acquiring lock "refresh_cache-28605b2e-9795-47a0-821c-5cf8da077d37" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1168.132632] env[60788]: DEBUG oslo_concurrency.lockutils [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] Acquired lock "refresh_cache-28605b2e-9795-47a0-821c-5cf8da077d37" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1168.132792] env[60788]: DEBUG nova.network.neutron [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Refreshing network info cache for port 8e65a732-c16c-4504-aa57-5e715d6d414f {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1168.713137] env[60788]: DEBUG nova.network.neutron [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Updated VIF entry in instance network info cache for port 8e65a732-c16c-4504-aa57-5e715d6d414f. 
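The "[datastore2] devstack-image-cache_base/1d9d6f6c-..." locks seen a few entries up guard a classic check-then-fetch cache: the first build for an image downloads and prepares the VMDK under the lock, and later builds find it already cached, which is what this "Processing image" pass is deciding. The same idiom against a local directory (hypothetical helper; Nova applies it to the datastore):

```python
import os
import tempfile
import threading

_cache_lock = threading.Lock()

def fetch_image_if_missing(cache_dir, image_id, fetch):
    """First caller downloads the image; everyone else reuses it."""
    path = os.path.join(cache_dir, image_id + ".vmdk")
    with _cache_lock:              # stands in for the datastore lock
        if not os.path.exists(path):
            fetch(path)            # e.g. stream the image from Glance
    return path

def fake_fetch(path):
    with open(path, "wb") as f:
        f.write(b"vmdk bytes")

cache = tempfile.mkdtemp()
print(fetch_image_if_missing(cache,
                             "1d9d6f6c-1335-48c8-9690-b6c8e781cb21",
                             fake_fetch))
```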
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1168.713825] env[60788]: DEBUG nova.network.neutron [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Updating instance_info_cache with network_info: [{"id": "8e65a732-c16c-4504-aa57-5e715d6d414f", "address": "fa:16:3e:4d:8b:23", "network": {"id": "31e5241c-3d65-4624-bcf6-45322ce7ebd6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1793513747-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "2313b232c99a4a16a40e01fec91c13f2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dbdab640-5fea-4254-8bd3-f855b7eaca0d", "external-id": "nsx-vlan-transportzone-615", "segmentation_id": 615, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8e65a732-c1", "ovs_interfaceid": "8e65a732-c16c-4504-aa57-5e715d6d414f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1168.728544] env[60788]: DEBUG oslo_concurrency.lockutils [req-c7ecc31d-09af-4ce3-ad81-fa0cb9e86016 req-6c81c6ac-83f3-46e4-9030-0e5074e4145e service nova] Releasing lock "refresh_cache-28605b2e-9795-47a0-821c-5cf8da077d37" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1169.413333] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "58bbe972-5fc1-4627-90e4-91251e047e86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1169.413615] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "58bbe972-5fc1-4627-90e4-91251e047e86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1173.924921] env[60788]: DEBUG oslo_concurrency.lockutils [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "28605b2e-9795-47a0-821c-5cf8da077d37" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.619344] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock 
"fb532f8b-5323-4f7a-be64-c6076a1862ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1174.619607] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1183.427795] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5ce8980-0966-470a-b2f9-b538df6c637e tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "bbfb23ab-0f4d-4195-ad4f-12b405a28267" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1183.428101] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5ce8980-0966-470a-b2f9-b538df6c637e tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "bbfb23ab-0f4d-4195-ad4f-12b405a28267" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1193.750945] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1194.753552] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1197.754035] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1197.754276] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1198.749156] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1198.753988] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1199.754582] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1199.754845] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1199.754980] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1199.776865] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777042] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777179] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777313] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777514] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777707] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777769] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.777972] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.778028] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.778770] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1199.778919] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1199.779408] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1199.779598] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1199.792867] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.793106] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.793433] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1199.793433] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1199.794628] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a58e9437-3df5-494c-95e0-f162eab8b0b4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.808415] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-388b2ad5-3210-439b-9d2a-3388e9953d2e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.826848] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f304d1e-c8b2-4933-b4e7-8b6432e5f107 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.835027] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e77bc89-572d-4613-b0a8-40acc1bd10fd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1199.867600] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181269MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1199.867700] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1199.867951] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1199.965238] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969069] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969069] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969069] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969069] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969418] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969418] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969418] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969418] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.969558] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1199.982860] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1199.996214] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 3265e0a4-28f2-4484-a164-4dc5af01d6ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.007819] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.043354] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.054245] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.065360] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.075891] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.086549] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance bbfb23ab-0f4d-4195-ad4f-12b405a28267 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
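The totals logged just below follow directly from this list: ten instances are tracked, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, and the 512 MB of reserved host memory is folded into used_ram (an assumption, but the only reading consistent with used_ram=1792MB):

```python
allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10
reserved_host_ram_mb = 512   # MEMORY_MB 'reserved' from the inventory

used_ram = reserved_host_ram_mb + sum(a['MEMORY_MB'] for a in allocations)
used_disk = sum(a['DISK_GB'] for a in allocations)
used_vcpus = sum(a['VCPU'] for a in allocations)

# Matches "used_ram=1792MB ... used_disk=10GB ... used_vcpus=10" below.
print(used_ram, used_disk, used_vcpus)   # 1792 10 10
```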
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1200.086787] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1200.086936] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1200.343617] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33d848a2-252c-4bc9-804f-0c83e5e15205 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.351504] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b75477d8-661b-4ef4-bb20-31944fc2462a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.382424] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43e95b20-5a24-441b-be2a-d22e32b8a831 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.389692] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c943f4-fd01-4fe7-8344-9f37c4631f8d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1200.402639] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1200.412753] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1200.426953] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1200.427159] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.559s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1201.401823] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1202.662062] env[60788]: DEBUG oslo_concurrency.lockutils [None req-963badbc-d090-4cb9-9b20-c632766e436c tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] Acquiring lock "778f4021-05ef-4904-864e-769e035df239" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1202.662062] env[60788]: DEBUG oslo_concurrency.lockutils [None req-963badbc-d090-4cb9-9b20-c632766e436c tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] Lock "778f4021-05ef-4904-864e-769e035df239" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1202.754844] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1203.741174] env[60788]: DEBUG oslo_concurrency.lockutils [None req-24c261cc-9f9c-4d34-8dc4-8b24ab7bc366 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "1793989e-b036-47d0-a036-5960936e145a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1203.741436] env[60788]: DEBUG oslo_concurrency.lockutils [None req-24c261cc-9f9c-4d34-8dc4-8b24ab7bc366 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "1793989e-b036-47d0-a036-5960936e145a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1210.789231] env[60788]: WARNING oslo_vmware.rw_handles [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles File
"/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1210.789231] env[60788]: ERROR oslo_vmware.rw_handles [ 1210.789967] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1210.792400] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1210.792631] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Copying Virtual Disk [datastore2] vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/3dd8f971-6380-435a-b7d0-975df7200a02/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1210.792923] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9539656b-de14-4aa8-b182-60c1c4bfe8cc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1210.801557] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1210.801557] env[60788]: value = "task-2205213" [ 1210.801557] env[60788]: _type = "Task" [ 1210.801557] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1210.809296] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205213, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1211.312615] env[60788]: DEBUG oslo_vmware.exceptions [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1211.312864] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1211.313446] env[60788]: ERROR nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1211.313446] env[60788]: Faults: ['InvalidArgument'] [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Traceback (most recent call last): [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] yield resources [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self.driver.spawn(context, instance, image_meta, [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self._fetch_image_if_missing(context, vi) [ 1211.313446] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] image_cache(vi, tmp_image_ds_loc) [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] vm_util.copy_virtual_disk( [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] session._wait_for_task(vmdk_copy_task) [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] return self.wait_for_task(task_ref) [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] return evt.wait() [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] result = hub.switch() [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1211.313731] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] return self.greenlet.switch() [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self.f(*self.args, **self.kw) [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] raise exceptions.translate_fault(task_info.error) [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Faults: ['InvalidArgument'] [ 1211.314042] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] [ 1211.314042] env[60788]: INFO nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Terminating instance [ 1211.315430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1211.315631] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1211.316950] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 
tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1211.317189] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1211.317467] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-00d7da33-8bbb-4c7e-8270-95105b652c3f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.319710] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c8fe294-119e-4b9b-9d88-2bd6fe804d3c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.326480] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1211.326697] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a7a69e42-8010-4be2-a92b-50ef7a811051 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.328803] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1211.328973] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1211.329896] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4f91b4d4-d8b6-4e60-82e2-8eff024db782 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.334608] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){ [ 1211.334608] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52fa67e9-54f5-cdd5-40f1-645a130afd93" [ 1211.334608] env[60788]: _type = "Task" [ 1211.334608] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1211.347612] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52fa67e9-54f5-cdd5-40f1-645a130afd93, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1211.387899] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1211.388133] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1211.388307] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleting the datastore file [datastore2] d0480645-be38-48de-9ae5-05c4eb0bf5d3 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1211.388582] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc49f75e-b124-4d79-a394-be6d50cbe122 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.395059] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1211.395059] env[60788]: value = "task-2205215" [ 1211.395059] env[60788]: _type = "Task" [ 1211.395059] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1211.402515] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205215, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1211.847774] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1211.847774] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating directory with path [datastore2] vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1211.847774] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5eeb7f17-b10f-4a06-b320-79eb9306b949 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.856748] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Created directory with path [datastore2] vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1211.856748] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Fetch image to [datastore2] vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1211.856941] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1211.857561] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b411abd-ebe2-44bf-86a0-f33b27c1bd33 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.863677] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a0b281-8ad3-4751-9933-8d1a2a9ffa7f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.872359] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eba1c608-452a-4120-9426-da50e2efca58 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.905182] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-aa5a44cf-1860-410d-a11d-901489988436 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.911991] env[60788]: DEBUG oslo_vmware.api [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205215, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08019} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1211.913408] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1211.913598] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1211.913796] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1211.913940] env[60788]: INFO nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Took 0.60 seconds to destroy the instance on the hypervisor. 
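[Editor's note] The CopyVirtualDisk_Task and DeleteDatastoreFile_Task records above follow oslo.vmware's invoke-then-poll pattern: the vCenter call returns a task object immediately, and the client polls it until it reaches a terminal state ("progress is 0%." ... "completed successfully."). The sketch below illustrates only that pattern; it is not oslo.vmware's implementation. session.get_task_info() is a hypothetical helper and the poll interval is an assumption, while the TaskInfo fields (state/progress/error/result) mirror the vSphere API.

    import time

    class TaskFailedError(Exception):
        """Raised when a vCenter task finishes in the 'error' state."""

    def wait_for_task(session, task_ref, poll_interval=0.5):
        """Poll task_ref until it succeeds or fails (illustrative only)."""
        while True:
            info = session.get_task_info(task_ref)  # hypothetical helper
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # corresponds to the "raise exceptions.translate_fault(...)"
                # frame in the tracebacks logged above
                raise TaskFailedError(info.error)
            # 'queued' or 'running': report progress and poll again, like
            # the "progress is 0%" lines in this log
            print(f"Task {task_ref} progress is {info.progress or 0}%")
            time.sleep(poll_interval)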
[ 1211.915973] env[60788]: DEBUG nova.compute.claims [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1211.916158] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1211.916379] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1211.918804] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5b3ae4f7-ee42-4163-8fe8-7125d5f28813 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1211.951798] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1212.004549] env[60788]: DEBUG oslo_vmware.rw_handles [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1212.066851] env[60788]: DEBUG oslo_vmware.rw_handles [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1212.067053] env[60788]: DEBUG oslo_vmware.rw_handles [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1212.239164] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15028d4c-c7dd-4624-8000-277df5f8ef7d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.246651] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae4c8548-a613-4865-9490-878ebc0a9d61 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.277917] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef83aa2a-6c71-45ef-9eab-4e9f4c38efe4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.284744] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29873910-8aa1-49e0-92d4-40f5ef8b3bcc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.297879] env[60788]: DEBUG nova.compute.provider_tree [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1212.306316] env[60788]: DEBUG nova.scheduler.client.report [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1212.320573] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.404s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1212.321136] env[60788]: ERROR nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1212.321136] env[60788]: Faults: ['InvalidArgument'] [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Traceback (most recent call last): [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1212.321136] 
env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self.driver.spawn(context, instance, image_meta, [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self._fetch_image_if_missing(context, vi) [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] image_cache(vi, tmp_image_ds_loc) [ 1212.321136] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] vm_util.copy_virtual_disk( [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] session._wait_for_task(vmdk_copy_task) [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] return self.wait_for_task(task_ref) [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] return evt.wait() [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] result = hub.switch() [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] return self.greenlet.switch() [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1212.321455] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] self.f(*self.args, **self.kw) [ 1212.321790] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1212.321790] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] raise exceptions.translate_fault(task_info.error) [ 1212.321790] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1212.321790] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Faults: ['InvalidArgument'] [ 1212.321790] env[60788]: ERROR nova.compute.manager [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] [ 1212.321921] env[60788]: DEBUG nova.compute.utils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1212.324350] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Build of instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 was re-scheduled: A specified parameter was not correct: fileType [ 1212.324350] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1212.324747] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1212.324920] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1212.325108] env[60788]: DEBUG nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1212.325313] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1212.729807] env[60788]: DEBUG nova.network.neutron [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1212.741068] env[60788]: INFO nova.compute.manager [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Took 0.41 seconds to deallocate network for instance. [ 1212.851308] env[60788]: INFO nova.scheduler.client.report [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleted allocations for instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 [ 1212.877805] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5d220af3-5a8e-4ef5-9493-ce2c265d4663 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 669.022s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1212.879583] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 472.063s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1212.879987] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1212.880339] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1212.880562] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1212.883445] env[60788]: INFO nova.compute.manager [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Terminating instance [ 1212.885656] env[60788]: DEBUG nova.compute.manager [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1212.885929] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1212.886534] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-89d0cd84-fb29-4ffc-85fa-0a7b5591dbaf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.890990] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 77e5ef91-47b8-4d27-a899-8f4a910851b7] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1212.898279] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b724fd1c-34a3-4fa8-bae3-19e1c276bc1a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.919341] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 77e5ef91-47b8-4d27-a899-8f4a910851b7] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1212.929556] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d0480645-be38-48de-9ae5-05c4eb0bf5d3 could not be found. 
[ 1212.929747] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1212.929950] env[60788]: INFO nova.compute.manager [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1212.930209] env[60788]: DEBUG oslo.service.loopingcall [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1212.930434] env[60788]: DEBUG nova.compute.manager [-] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1212.930531] env[60788]: DEBUG nova.network.neutron [-] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1212.945679] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "77e5ef91-47b8-4d27-a899-8f4a910851b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 235.096s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1212.954118] env[60788]: DEBUG nova.network.neutron [-] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1212.955993] env[60788]: DEBUG nova.compute.manager [None req-f03e83e7-6092-4deb-b19c-e3b730ef6142 tempest-InstanceActionsNegativeTestJSON-758130375 tempest-InstanceActionsNegativeTestJSON-758130375-project-member] [instance: f4a6ac93-39eb-4a36-93d3-b01150092707] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1212.961624] env[60788]: INFO nova.compute.manager [-] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] Took 0.03 seconds to deallocate network for instance. [ 1212.977728] env[60788]: DEBUG nova.compute.manager [None req-f03e83e7-6092-4deb-b19c-e3b730ef6142 tempest-InstanceActionsNegativeTestJSON-758130375 tempest-InstanceActionsNegativeTestJSON-758130375-project-member] [instance: f4a6ac93-39eb-4a36-93d3-b01150092707] Instance disappeared before build.
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1212.996438] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f03e83e7-6092-4deb-b19c-e3b730ef6142 tempest-InstanceActionsNegativeTestJSON-758130375 tempest-InstanceActionsNegativeTestJSON-758130375-project-member] Lock "f4a6ac93-39eb-4a36-93d3-b01150092707" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 230.386s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.004687] env[60788]: DEBUG nova.compute.manager [None req-8e7c8fdf-841f-4b79-8248-a1f51f09a489 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 027da562-4bf7-436d-bd68-af586797587a] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.026607] env[60788]: DEBUG nova.compute.manager [None req-8e7c8fdf-841f-4b79-8248-a1f51f09a489 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 027da562-4bf7-436d-bd68-af586797587a] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.042774] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c5bfb19-7578-46f1-9838-b0ee23ef8988 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.163s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.043788] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 137.292s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1213.043983] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d0480645-be38-48de-9ae5-05c4eb0bf5d3] During sync_power_state the instance has a pending task (deleting). Skip.
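[Editor's note] Nearly every DEBUG line from oslo_concurrency.lockutils in this section is the same Acquiring / acquired / "released" triplet, reporting how long the caller waited for the lock and how long it held it. Those triplets are emitted by a wrapper around the locked function; the usage sketch below shows the pattern with oslo.concurrency's real synchronized decorator, while the decorated function itself is illustrative rather than Nova's code.

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        # Runs under the "compute_resources" lock. The wrapper logs
        # 'Acquiring lock ... by ...', 'Lock ... acquired ... :: waited Ns'
        # and 'Lock ... "released" ... :: held Ns' around this body.
        pass

    update_available_resource()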
[ 1213.044175] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "d0480645-be38-48de-9ae5-05c4eb0bf5d3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.050419] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8e7c8fdf-841f-4b79-8248-a1f51f09a489 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "027da562-4bf7-436d-bd68-af586797587a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.941s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.080194] env[60788]: DEBUG nova.compute.manager [None req-0e2110a6-fae3-4ad5-966f-e06fad47fd2a tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] [instance: 6489648d-415a-4625-9ac6-7ee30622c8bf] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.102544] env[60788]: DEBUG nova.compute.manager [None req-0e2110a6-fae3-4ad5-966f-e06fad47fd2a tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] [instance: 6489648d-415a-4625-9ac6-7ee30622c8bf] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.123461] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e2110a6-fae3-4ad5-966f-e06fad47fd2a tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] Lock "6489648d-415a-4625-9ac6-7ee30622c8bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 222.887s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.132213] env[60788]: DEBUG nova.compute.manager [None req-3b9638fa-c424-42c2-b9c7-09452bbc1f0e tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] [instance: 49f64e1c-063b-4483-bd68-7423b72ea4a5] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.159170] env[60788]: DEBUG nova.compute.manager [None req-3b9638fa-c424-42c2-b9c7-09452bbc1f0e tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] [instance: 49f64e1c-063b-4483-bd68-7423b72ea4a5] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.180970] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3b9638fa-c424-42c2-b9c7-09452bbc1f0e tempest-ServerShowV247Test-1497902182 tempest-ServerShowV247Test-1497902182-project-member] Lock "49f64e1c-063b-4483-bd68-7423b72ea4a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 221.912s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.190667] env[60788]: DEBUG nova.compute.manager [None req-edbb33ae-4359-4d10-8cb7-f86cef7c0a78 tempest-ServersNegativeTestMultiTenantJSON-929736961 tempest-ServersNegativeTestMultiTenantJSON-929736961-project-member] [instance: 0ea664eb-1978-4725-b8a5-75ce53f0d165] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.214221] env[60788]: DEBUG nova.compute.manager [None req-edbb33ae-4359-4d10-8cb7-f86cef7c0a78 tempest-ServersNegativeTestMultiTenantJSON-929736961 tempest-ServersNegativeTestMultiTenantJSON-929736961-project-member] [instance: 0ea664eb-1978-4725-b8a5-75ce53f0d165] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.234362] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edbb33ae-4359-4d10-8cb7-f86cef7c0a78 tempest-ServersNegativeTestMultiTenantJSON-929736961 tempest-ServersNegativeTestMultiTenantJSON-929736961-project-member] Lock "0ea664eb-1978-4725-b8a5-75ce53f0d165" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.252s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.242862] env[60788]: DEBUG nova.compute.manager [None req-cfda5757-27b3-4205-8301-d99854360996 tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] [instance: 2da6479b-4b3a-4d7d-91cb-81b563b11732] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.264268] env[60788]: DEBUG nova.compute.manager [None req-cfda5757-27b3-4205-8301-d99854360996 tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] [instance: 2da6479b-4b3a-4d7d-91cb-81b563b11732] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.283764] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cfda5757-27b3-4205-8301-d99854360996 tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] Lock "2da6479b-4b3a-4d7d-91cb-81b563b11732" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.098s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.292196] env[60788]: DEBUG nova.compute.manager [None req-8de8c1bc-e269-4146-a502-f028313daa01 tempest-ServerActionsTestJSON-2104909477 tempest-ServerActionsTestJSON-2104909477-project-member] [instance: 188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.313726] env[60788]: DEBUG nova.compute.manager [None req-8de8c1bc-e269-4146-a502-f028313daa01 tempest-ServerActionsTestJSON-2104909477 tempest-ServerActionsTestJSON-2104909477-project-member] [instance: 188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1] Instance disappeared before build.
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.333841] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8de8c1bc-e269-4146-a502-f028313daa01 tempest-ServerActionsTestJSON-2104909477 tempest-ServerActionsTestJSON-2104909477-project-member] Lock "188b4caf-70f7-4a9a-9cdd-4e1d80d81ab1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.998s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.341882] env[60788]: DEBUG nova.compute.manager [None req-42a0f89b-848a-4c32-9cfe-ae958be870db tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.366062] env[60788]: DEBUG nova.compute.manager [None req-42a0f89b-848a-4c32-9cfe-ae958be870db tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.386877] env[60788]: DEBUG oslo_concurrency.lockutils [None req-42a0f89b-848a-4c32-9cfe-ae958be870db tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "4071ed6c-f611-4b6d-a6eb-f62d5ac0ab93" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.576s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.398147] env[60788]: DEBUG nova.compute.manager [None req-d9155b2b-84ad-4132-9a59-af367d8be14a tempest-ServerMetadataNegativeTestJSON-1034376198 tempest-ServerMetadataNegativeTestJSON-1034376198-project-member] [instance: 3265e0a4-28f2-4484-a164-4dc5af01d6ad] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.421077] env[60788]: DEBUG nova.compute.manager [None req-d9155b2b-84ad-4132-9a59-af367d8be14a tempest-ServerMetadataNegativeTestJSON-1034376198 tempest-ServerMetadataNegativeTestJSON-1034376198-project-member] [instance: 3265e0a4-28f2-4484-a164-4dc5af01d6ad] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}} [ 1213.443033] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d9155b2b-84ad-4132-9a59-af367d8be14a tempest-ServerMetadataNegativeTestJSON-1034376198 tempest-ServerMetadataNegativeTestJSON-1034376198-project-member] Lock "3265e0a4-28f2-4484-a164-4dc5af01d6ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.268s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.453999] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Starting instance...
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1213.502512] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1213.502772] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1213.504407] env[60788]: INFO nova.compute.claims [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1213.741144] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b33e7115-0ab4-438c-99d7-7383930523a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.748971] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5e3789e-bfb0-412e-b511-997e0385e5cc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.778102] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0528e6c-c1d0-49eb-b940-41f5039ecd80 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.785105] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a98ae039-918b-48ec-8d36-44961baf071b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.798937] env[60788]: DEBUG nova.compute.provider_tree [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1213.807272] env[60788]: DEBUG nova.scheduler.client.report [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1213.821752] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.822246] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1213.860863] env[60788]: DEBUG nova.compute.utils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1213.862171] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1213.862347] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1213.872613] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1213.924964] env[60788]: DEBUG nova.policy [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9392b1b266b347099c744d45eb8ab0d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '17ac9c63cf7e4db9bc6473c6a526d7c3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1213.942260] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Start spawning the instance on the hypervisor. 
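
The sequence above interleaves work: IP allocation is kicked off in the background ("Allocating IP information in the background.") while block device mappings are built, and the network result is only awaited once spawn actually needs it. A stdlib-futures sketch of that shape — Nova itself uses eventlet greenthreads, and the functions and return values below are stand-ins:

    from concurrent.futures import ThreadPoolExecutor

    def allocate_for_instance(instance_uuid):
        # stand-in for the Neutron port create/update round trips
        return [{"id": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0",
                 "address": "fa:16:3e:33:5e:84"}]

    def build_block_device_mappings(instance_uuid):
        # stand-in for the BDM work that proceeds while ports are created
        return [{"device_name": "/dev/sda", "boot_index": 0}]

    with ThreadPoolExecutor(max_workers=1) as pool:
        uuid = "f7fa5c24-7ff5-4656-897f-b0164c989207"
        nw = pool.submit(allocate_for_instance, uuid)
        bdms = build_block_device_mappings(uuid)
        network_info = nw.result()  # block only when spawn needs the ports
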
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1213.969465] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1213.969709] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1213.969896] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1213.970114] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1213.970264] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1213.970410] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1213.970620] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1213.970779] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1213.970945] env[60788]: DEBUG nova.virt.hardware [None 
req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1213.971271] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1213.971468] env[60788]: DEBUG nova.virt.hardware [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1213.972368] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-276798e9-765f-4977-853d-5a56a92baf6c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1213.980220] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14ce4b62-dc32-4cad-8e85-a9bf8b6bf112 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1214.253375] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Successfully created port: 1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1215.341252] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Successfully updated port: 1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1215.357155] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "refresh_cache-f7fa5c24-7ff5-4656-897f-b0164c989207" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1215.357368] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquired lock "refresh_cache-f7fa5c24-7ff5-4656-897f-b0164c989207" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1215.357578] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1215.361215] env[60788]: DEBUG nova.compute.manager [req-6edff083-6a64-46c8-84bd-cee63cd0133b req-3bbc1e9c-f3ec-416a-9681-5ded6f6d7489 
service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Received event network-vif-plugged-1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1215.361489] env[60788]: DEBUG oslo_concurrency.lockutils [req-6edff083-6a64-46c8-84bd-cee63cd0133b req-3bbc1e9c-f3ec-416a-9681-5ded6f6d7489 service nova] Acquiring lock "f7fa5c24-7ff5-4656-897f-b0164c989207-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1215.361761] env[60788]: DEBUG oslo_concurrency.lockutils [req-6edff083-6a64-46c8-84bd-cee63cd0133b req-3bbc1e9c-f3ec-416a-9681-5ded6f6d7489 service nova] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1215.361990] env[60788]: DEBUG oslo_concurrency.lockutils [req-6edff083-6a64-46c8-84bd-cee63cd0133b req-3bbc1e9c-f3ec-416a-9681-5ded6f6d7489 service nova] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1215.362232] env[60788]: DEBUG nova.compute.manager [req-6edff083-6a64-46c8-84bd-cee63cd0133b req-3bbc1e9c-f3ec-416a-9681-5ded6f6d7489 service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] No waiting events found dispatching network-vif-plugged-1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1215.362462] env[60788]: WARNING nova.compute.manager [req-6edff083-6a64-46c8-84bd-cee63cd0133b req-3bbc1e9c-f3ec-416a-9681-5ded6f6d7489 service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Received unexpected event network-vif-plugged-1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 for instance with vm_state building and task_state spawning. [ 1215.421150] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance cache missing network info. 
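
The "Received unexpected event" warning above is the miss path of the instance-event bookkeeping: Neutron's network-vif-plugged notification arrived before spawn registered a waiter for it, so there was nothing to wake. A rough model of the pop-or-drop behavior, as a hypothetical simplification of the InstanceEvents logic:

    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}   # (instance_uuid, event_name) -> Event
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, name):
            # spawn registers interest before triggering the plug
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, name)] = ev
            return ev

        def pop_instance_event(self, instance_uuid, name):
            # the external-event handler pops a waiter if one exists
            with self._lock:
                ev = self._waiters.pop((instance_uuid, name), None)
            if ev is None:
                print(f"Received unexpected event {name} "
                      f"for instance {instance_uuid}")
            else:
                ev.set()   # wake the thread blocked in spawn
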
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1215.577075] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Updating instance_info_cache with network_info: [{"id": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0", "address": "fa:16:3e:33:5e:84", "network": {"id": "f0fec424-bc31-4da2-9782-2763f5eecb37", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-62004294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "17ac9c63cf7e4db9bc6473c6a526d7c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c7d2575f-b92f-44ec-a863-634cb76631a2", "external-id": "nsx-vlan-transportzone-794", "segmentation_id": 794, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1bdbb17d-b9", "ovs_interfaceid": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1215.590614] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Releasing lock "refresh_cache-f7fa5c24-7ff5-4656-897f-b0164c989207" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1215.590901] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance network_info: |[{"id": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0", "address": "fa:16:3e:33:5e:84", "network": {"id": "f0fec424-bc31-4da2-9782-2763f5eecb37", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-62004294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "17ac9c63cf7e4db9bc6473c6a526d7c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c7d2575f-b92f-44ec-a863-634cb76631a2", "external-id": "nsx-vlan-transportzone-794", "segmentation_id": 794, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1bdbb17d-b9", "ovs_interfaceid": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1215.591296] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:5e:84', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c7d2575f-b92f-44ec-a863-634cb76631a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1bdbb17d-b96f-4305-94f6-eeccf1c81ed0', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1215.599119] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Creating folder: Project (17ac9c63cf7e4db9bc6473c6a526d7c3). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1215.599673] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-881708db-bbcd-4fa7-853f-937d39383862 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.610568] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Created folder: Project (17ac9c63cf7e4db9bc6473c6a526d7c3) in parent group-v449747. [ 1215.610748] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Creating folder: Instances. Parent ref: group-v449817. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1215.610964] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce0d64ed-36de-4f55-805e-48add08eea1b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.621958] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Created folder: Instances in parent group-v449817. [ 1215.622204] env[60788]: DEBUG oslo.service.loopingcall [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1215.622379] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1215.622568] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d999f33f-9591-4c8f-9e25-5e316829ddca {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.640774] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1215.640774] env[60788]: value = "task-2205218" [ 1215.640774] env[60788]: _type = "Task" [ 1215.640774] env[60788]: } to complete. 
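
CreateVM_Task returns immediately with a task handle ("task-2205218"); the "progress is 0%." and "completed successfully" lines that follow come from polling that handle until it reaches a terminal state. A generic sketch of such a loop — get_task_info() is a hypothetical stub standing in for the real vSphere property-collector query:

    import time

    _poll_count = {"n": 0}

    def get_task_info(task_ref):
        # hypothetical stub: report 0% twice, then report success
        _poll_count["n"] += 1
        if _poll_count["n"] < 3:
            return {"state": "running", "progress": 0}
        return {"state": "success", "result": "vm-example"}

    def wait_for_task(task_ref, poll_interval=0.5):
        while True:
            info = get_task_info(task_ref)
            if info["state"] == "success":
                return info["result"]
            if info["state"] == "error":
                raise RuntimeError(info["error"])
            print(f"Task: {task_ref} progress is {info['progress']}%.")
            time.sleep(poll_interval)

    print(wait_for_task("task-2205218"))
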
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.647765] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205218, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.150163] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205218, 'name': CreateVM_Task, 'duration_secs': 0.271623} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1216.150327] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1216.151011] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1216.151188] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1216.151496] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1216.151751] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4f4e934-903c-4178-bc76-f5823c5aee10 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.156130] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Waiting for the task: (returnval){ [ 1216.156130] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]526d8a5a-763d-e946-acae-292b522dde64" [ 1216.156130] env[60788]: _type = "Task" [ 1216.156130] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1216.165176] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]526d8a5a-763d-e946-acae-292b522dde64, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.666766] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1216.667152] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1216.667272] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1217.384471] env[60788]: DEBUG nova.compute.manager [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Received event network-changed-1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1217.384573] env[60788]: DEBUG nova.compute.manager [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Refreshing instance network info cache due to event network-changed-1bdbb17d-b96f-4305-94f6-eeccf1c81ed0. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1217.384791] env[60788]: DEBUG oslo_concurrency.lockutils [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] Acquiring lock "refresh_cache-f7fa5c24-7ff5-4656-897f-b0164c989207" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1217.384933] env[60788]: DEBUG oslo_concurrency.lockutils [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] Acquired lock "refresh_cache-f7fa5c24-7ff5-4656-897f-b0164c989207" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1217.385107] env[60788]: DEBUG nova.network.neutron [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Refreshing network info cache for port 1bdbb17d-b96f-4305-94f6-eeccf1c81ed0 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1217.788754] env[60788]: DEBUG nova.network.neutron [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Updated VIF entry in instance network info cache for port 1bdbb17d-b96f-4305-94f6-eeccf1c81ed0. 
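
"Updated VIF entry in instance network info cache" reduces to locating the VIF by its port id in the cached network_info list and swapping in the refreshed copy. A minimal sketch over plain dicts:

    def update_vif_entry(network_info, refreshed_vif):
        for i, vif in enumerate(network_info):
            if vif["id"] == refreshed_vif["id"]:
                network_info[i] = refreshed_vif   # replace stale entry
                return network_info
        network_info.append(refreshed_vif)        # unseen port: add it
        return network_info

    cache = [{"id": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0",
              "active": False}]
    update_vif_entry(cache, {"id": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0",
                             "active": True})
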
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1217.789137] env[60788]: DEBUG nova.network.neutron [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Updating instance_info_cache with network_info: [{"id": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0", "address": "fa:16:3e:33:5e:84", "network": {"id": "f0fec424-bc31-4da2-9782-2763f5eecb37", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-62004294-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "17ac9c63cf7e4db9bc6473c6a526d7c3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c7d2575f-b92f-44ec-a863-634cb76631a2", "external-id": "nsx-vlan-transportzone-794", "segmentation_id": 794, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1bdbb17d-b9", "ovs_interfaceid": "1bdbb17d-b96f-4305-94f6-eeccf1c81ed0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1217.800595] env[60788]: DEBUG oslo_concurrency.lockutils [req-2ea4fbb4-365c-49cf-b947-7981c2985dad req-d6bdff22-45b7-4235-b4f7-642d6f9bf2fd service nova] Releasing lock "refresh_cache-f7fa5c24-7ff5-4656-897f-b0164c989207" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1237.666190] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "f7fa5c24-7ff5-4656-897f-b0164c989207" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1256.754192] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1258.754744] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1259.754584] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1259.754863] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1260.749579] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1260.753198] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1260.753360] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1260.753483] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1260.775183] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.775462] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.775462] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.775584] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.775707] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.775829] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.775952] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. 
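
The run of "Skipping network cache update" lines that follows reflects the heal task's selection pass: instances still building are not eligible for a cache refresh, and with nothing else on the host the task reports that it found no candidates. An assumed simplification of that loop:

    BUILDING = "building"

    def pick_instance_to_heal(instances):
        for inst in instances:
            if inst["vm_state"] == BUILDING:
                print(f"[instance: {inst['uuid']}] Skipping network cache "
                      "update for instance because it is Building.")
                continue
            return inst   # first steady-state instance gets healed
        print("Didn't find any instances for network info cache update.")
        return None
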
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.776088] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.776211] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.776346] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1260.776460] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1260.776913] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1261.754048] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1261.754367] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1261.765893] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1261.766157] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1261.766352] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1261.766529] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 
1261.767758] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99efe862-a1d8-45da-af87-670120d7d80f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.777274] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1736045d-da41-4278-900e-6eb6c2d90b34 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.792144] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbda8e68-bf21-4fde-a4ae-0981b21f1ad2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.798396] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-455feee5-381b-4204-a495-1d19eeb94947 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.827053] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181261MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1261.827203] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1261.827391] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1261.880944] env[60788]: WARNING oslo_vmware.rw_handles [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles 
http.client.RemoteDisconnected: Remote end closed connection without response [ 1261.880944] env[60788]: ERROR oslo_vmware.rw_handles [ 1261.881423] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1261.883358] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1261.883600] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Copying Virtual Disk [datastore2] vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/14f1882d-625c-4df8-a4fa-f1f49478c3c3/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1261.884098] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fb0487dc-7935-4309-be83-f482c79d2231 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1261.893119] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){ [ 1261.893119] env[60788]: value = "task-2205219" [ 1261.893119] env[60788]: _type = "Task" [ 1261.893119] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1261.901226] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': task-2205219, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1261.902081] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 01821598-4692-440b-8128-c50e359386e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902241] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902362] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902481] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902601] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902716] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902833] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.902948] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.903078] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.903194] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1261.914022] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.924743] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.934143] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.944187] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.954271] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance bbfb23ab-0f4d-4195-ad4f-12b405a28267 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.963540] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 778f4021-05ef-4904-864e-769e035df239 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.972807] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1793989e-b036-47d0-a036-5960936e145a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
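
The per-instance allocations listed above sum exactly to the totals reported a few entries later (used_ram=1792MB, used_disk=10GB, used_vcpus=10): ten actively managed instances at 128MB / 1GB / 1 vCPU each (the m1.nano flavor), plus the 512MB host memory reservation from the inventory data; the seven scheduled-but-not-started instances are excluded from the usage view. Reproducing the arithmetic:

    allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10
    reserved_ram_mb = 512   # MEMORY_MB 'reserved' in the inventory data

    used_ram = reserved_ram_mb + sum(a["MEMORY_MB"] for a in allocations)
    used_disk = sum(a["DISK_GB"] for a in allocations)
    used_vcpus = sum(a["VCPU"] for a in allocations)
    print(used_ram, used_disk, used_vcpus)   # -> 1792 10 10
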
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1261.973041] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1261.973195] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1262.147738] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fef2c6ee-edef-4784-aafa-84fda31ff063 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.155054] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb877dec-6cdb-4dec-ba62-b1500e2ead48 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.185573] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07ec32fd-7575-479f-89c7-799f222ad442 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.192712] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9db70353-f912-42b5-aef8-188cd9f54f9b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.205657] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1262.214753] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1262.231540] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1262.231720] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.404s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1262.404363] env[60788]: DEBUG oslo_vmware.exceptions [None 
req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1262.404607] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1262.405107] env[60788]: ERROR nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1262.405107] env[60788]: Faults: ['InvalidArgument'] [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] Traceback (most recent call last): [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] yield resources [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self.driver.spawn(context, instance, image_meta, [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self._fetch_image_if_missing(context, vi) [ 1262.405107] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] image_cache(vi, tmp_image_ds_loc) [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] vm_util.copy_virtual_disk( [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] 
session._wait_for_task(vmdk_copy_task) [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] return self.wait_for_task(task_ref) [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] return evt.wait() [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] result = hub.switch() [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1262.405582] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] return self.greenlet.switch() [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self.f(*self.args, **self.kw) [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] raise exceptions.translate_fault(task_info.error) [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] Faults: ['InvalidArgument'] [ 1262.405964] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] [ 1262.405964] env[60788]: INFO nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Terminating instance [ 1262.406932] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1262.407146] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 
1262.407390] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d978bddb-6904-4200-b28e-e1f6a89ddf85 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.410827] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1262.411084] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1262.411915] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-914873be-e5da-4c57-916c-5779bd1ff0b2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.419457] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1262.419708] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5657e86b-8e32-4ecb-933d-6ed3b5fb15f7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.422235] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1262.422450] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1262.423523] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8cba3543-ac5e-415f-9837-081e481ff3f8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.429065] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Waiting for the task: (returnval){ [ 1262.429065] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52985d4a-3bac-bc4f-fdd1-4ac83b921749" [ 1262.429065] env[60788]: _type = "Task" [ 1262.429065] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1262.436490] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52985d4a-3bac-bc4f-fdd1-4ac83b921749, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1262.490614] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1262.490837] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1262.491025] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Deleting the datastore file [datastore2] 01821598-4692-440b-8128-c50e359386e2 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1262.491280] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b61874ce-9ef5-4bc4-8244-0ef3cfbe1c72 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.497350] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){ [ 1262.497350] env[60788]: value = "task-2205221" [ 1262.497350] env[60788]: _type = "Task" [ 1262.497350] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1262.504842] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': task-2205221, 'name': DeleteDatastoreFile_Task} progress is 0%. 
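
The "Waiting for the task ... to complete" and "progress is 0%" records above are the oslo.vmware task-polling loop: an API call returns a task reference, and a looping call polls its state until it succeeds or errors. The general shape, sketched against a hypothetical fetch_task_info() callable rather than a live vCenter session:

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(fetch_task_info, interval=0.5):
        """Poll a vSphere-style task until it reaches a terminal state.

        fetch_task_info is a hypothetical callable returning a dict such
        as {'state': 'running'|'success'|'error', 'progress': 0, ...}.
        """
        while True:
            info = fetch_task_info()
            if info['state'] == 'success':
                return info          # e.g. DeleteDatastoreFile_Task completed
            if info['state'] == 'error':
                # A task error is translated into an exception and raised,
                # as with the VimFaultException earlier in this log.
                raise TaskFailed(info.get('error', 'unknown fault'))
            time.sleep(interval)     # "... progress is 0%" between polls
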
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1262.940028] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1262.940384] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Creating directory with path [datastore2] vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1262.940503] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-23aeb085-0d11-43a1-b120-bf7ffcbb1a5e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.951488] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Created directory with path [datastore2] vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1262.951675] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Fetch image to [datastore2] vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1262.951825] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1262.952540] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f4bf457-4dc9-4738-8327-ddb799e68122 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.958917] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa26d76d-7505-4112-9aef-b372577a44f2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1262.968766] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e2f96c8-fbd7-4975-8e09-906f9bb999bb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.002023] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-86883e15-303c-499f-96c4-f6f3e2557072 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.009084] env[60788]: DEBUG oslo_vmware.api [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': task-2205221, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.090498} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1263.010512] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1263.010702] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1263.010870] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1263.011054] env[60788]: INFO nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Took 0.60 seconds to destroy the instance on the hypervisor. 
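
The spawn failure earlier in this stream began with "Fault InvalidArgument not matched" followed by a generic VimFaultException: the fault translator looks the vSphere fault name up in a registry of exception classes and falls back to a generic type when nothing matches. A sketch of that pattern; the registry contents below are invented for illustration, not oslo.vmware's actual mapping:

    class VimFault(Exception):
        pass

    class FileNotFoundFault(VimFault):
        pass

    # Illustrative registry of fault name -> exception class.
    FAULT_MAP = {'FileNotFound': FileNotFoundFault}

    def translate_fault(name, message):
        # An unmatched name ("Fault InvalidArgument not matched" above)
        # falls back to the generic exception type.
        return FAULT_MAP.get(name, VimFault)(message)

    exc = translate_fault('InvalidArgument',
                          'A specified parameter was not correct: fileType')
    assert type(exc) is VimFault
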
[ 1263.012726] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f4ae027b-6bdf-43a6-a7b5-181d19c5a9d0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.014554] env[60788]: DEBUG nova.compute.claims [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1263.014766] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1263.014930] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1263.037608] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1263.213266] env[60788]: DEBUG oslo_vmware.rw_handles [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1263.273026] env[60788]: DEBUG oslo_vmware.rw_handles [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1263.273283] env[60788]: DEBUG oslo_vmware.rw_handles [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
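
The rw_handles records above stream the 21318656-byte image over HTTPS to a datastore folder URL opened as a write handle. Stripped of oslo.vmware's chunked transfer, SSL options and error handling, the underlying idea is a plain HTTP PUT carrying a session ticket cookie; the host, path and cookie below are placeholders:

    import http.client

    def upload_to_datastore(host, path, data, cookie=None):
        """Sketch: PUT raw bytes to a datastore folder URL.

        'cookie' stands in for a vCenter session or generic service
        ticket (see the AcquireGenericServiceTicket call above).
        """
        conn = http.client.HTTPSConnection(host, 443)
        headers = {'Content-Length': str(len(data))}
        if cookie:
            headers['Cookie'] = cookie
        conn.request('PUT', path, body=data, headers=headers)
        resp = conn.getresponse()   # the step that later fails with
        resp.read()                 # RemoteDisconnected in this log
        conn.close()
        return resp.status
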
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1263.303778] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ae26d49-6047-4add-95cd-dd936e172113 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.311762] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1ca1b62-d3f3-4d30-acdb-e6c82270b338 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.342622] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7884ded1-49b3-4b45-bd29-68840856fc84 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.349736] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1a0908c-1f20-4f03-a2d7-fb0f7162ee02 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1263.362646] env[60788]: DEBUG nova.compute.provider_tree [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1263.371334] env[60788]: DEBUG nova.scheduler.client.report [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1263.385081] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.370s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1263.385630] env[60788]: ERROR nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1263.385630] env[60788]: Faults: ['InvalidArgument'] [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] Traceback (most recent call last): [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1263.385630] env[60788]: 
ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self.driver.spawn(context, instance, image_meta, [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self._fetch_image_if_missing(context, vi) [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] image_cache(vi, tmp_image_ds_loc) [ 1263.385630] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] vm_util.copy_virtual_disk( [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] session._wait_for_task(vmdk_copy_task) [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] return self.wait_for_task(task_ref) [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] return evt.wait() [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] result = hub.switch() [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] return self.greenlet.switch() [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1263.385951] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] self.f(*self.args, **self.kw) [ 1263.386351] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1263.386351] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] raise exceptions.translate_fault(task_info.error) [ 1263.386351] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1263.386351] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] Faults: ['InvalidArgument'] [ 1263.386351] env[60788]: ERROR nova.compute.manager [instance: 01821598-4692-440b-8128-c50e359386e2] [ 1263.386351] env[60788]: DEBUG nova.compute.utils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1263.387676] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Build of instance 01821598-4692-440b-8128-c50e359386e2 was re-scheduled: A specified parameter was not correct: fileType [ 1263.387676] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1263.388075] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1263.388251] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1263.388444] env[60788]: DEBUG nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1263.388610] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1263.746733] env[60788]: DEBUG nova.network.neutron [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1263.760507] env[60788]: INFO nova.compute.manager [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Took 0.37 seconds to deallocate network for instance. [ 1263.871383] env[60788]: INFO nova.scheduler.client.report [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Deleted allocations for instance 01821598-4692-440b-8128-c50e359386e2 [ 1263.899301] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0ed14e87-cf4e-4a51-9ec2-b8735ecb1a29 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "01821598-4692-440b-8128-c50e359386e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 686.583s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1263.899301] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "01821598-4692-440b-8128-c50e359386e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 487.424s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1263.899301] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "01821598-4692-440b-8128-c50e359386e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1263.899468] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "01821598-4692-440b-8128-c50e359386e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1263.899468] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "01821598-4692-440b-8128-c50e359386e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1263.901050] env[60788]: INFO nova.compute.manager [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Terminating instance [ 1263.902715] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1263.903993] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1263.903993] env[60788]: DEBUG nova.network.neutron [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1263.915824] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1263.940486] env[60788]: DEBUG nova.network.neutron [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance cache missing network info. 
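
The lock lifecycle lines here ("Acquiring lock ... by ...", "acquired ... waited 0.000s", '"released" ... held 686.583s') are emitted by oslo.concurrency's lockutils wrappers, which time the wait and hold phases of every named lock. A minimal example of the same API, with placeholder bodies:

    from oslo_concurrency import lockutils

    # Decorator form: serialize callers on a named semaphore, as the
    # resource tracker does with "compute_resources".
    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        pass  # placeholder critical section

    # Context-manager form, as used for the per-instance cache locks:
    with lockutils.lock('refresh_cache-01821598-4692-440b-8128-c50e359386e2'):
        pass  # placeholder critical section
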
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1263.974182] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1263.974441] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1263.976111] env[60788]: INFO nova.compute.claims [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1264.148714] env[60788]: DEBUG nova.network.neutron [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1264.158943] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "refresh_cache-01821598-4692-440b-8128-c50e359386e2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1264.159483] env[60788]: DEBUG nova.compute.manager [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1264.159688] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1264.160342] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-52cd0793-ba14-4979-809f-de4508268a87 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.168968] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75d2366e-2fab-4d5c-812e-2ed2c894c0d8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.203685] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 01821598-4692-440b-8128-c50e359386e2 could not be found. [ 1264.203902] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1264.204105] env[60788]: INFO nova.compute.manager [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 01821598-4692-440b-8128-c50e359386e2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1264.204354] env[60788]: DEBUG oslo.service.loopingcall [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1264.204781] env[60788]: DEBUG nova.compute.manager [-] [instance: 01821598-4692-440b-8128-c50e359386e2] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1264.204887] env[60788]: DEBUG nova.network.neutron [-] [instance: 01821598-4692-440b-8128-c50e359386e2] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1264.221766] env[60788]: DEBUG nova.network.neutron [-] [instance: 01821598-4692-440b-8128-c50e359386e2] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1264.229447] env[60788]: DEBUG nova.network.neutron [-] [instance: 01821598-4692-440b-8128-c50e359386e2] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1264.230821] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1264.237778] env[60788]: INFO nova.compute.manager [-] [instance: 01821598-4692-440b-8128-c50e359386e2] Took 0.03 seconds to deallocate network for instance. [ 1264.256346] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95146e10-c75f-4271-88c3-edd121e6c53e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.265412] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d21a6811-41ac-48ee-8dc8-c01fc61aaeb1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.301238] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68e73a95-e559-4a0d-b305-c95b0a95df2c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.308940] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1450082-0536-4164-86bd-063c3df62ab7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.324276] env[60788]: DEBUG nova.compute.provider_tree [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1264.332615] env[60788]: DEBUG nova.scheduler.client.report [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1264.347465] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1264.347971] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 
tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1264.363947] env[60788]: DEBUG oslo_concurrency.lockutils [None req-552192fe-dd88-447f-b195-cd23ffd55d59 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "01821598-4692-440b-8128-c50e359386e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.466s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1264.364746] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "01821598-4692-440b-8128-c50e359386e2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 188.613s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1264.364928] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 01821598-4692-440b-8128-c50e359386e2] During sync_power_state the instance has a pending task (deleting). Skip. [ 1264.365133] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "01821598-4692-440b-8128-c50e359386e2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1264.380254] env[60788]: DEBUG nova.compute.utils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1264.381397] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Not allocating networking since 'none' was specified. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1953}} [ 1264.388703] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1264.450917] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Start spawning the instance on the hypervisor. 
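
"Using /dev/sd instead of None" above is the device-naming helper falling back to the default disk prefix before picking the next free device name. A toy sketch of next-name generation, assuming single-letter suffixes only (the real helper also copes with multi-letter names and existing block-device mappings):

    def next_device_name(used, prefix='/dev/sd'):
        """Return the first unused name in /dev/sda, /dev/sdb, ..."""
        for i in range(26):
            candidate = prefix + chr(ord('a') + i)
            if candidate not in used:
                return candidate
        raise ValueError('no free device name under %s' % prefix)

    assert next_device_name(set()) == '/dev/sda'
    assert next_device_name({'/dev/sda'}) == '/dev/sdb'
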
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1264.476498] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1264.476705] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1264.476858] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1264.477051] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1264.477203] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1264.477349] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1264.477553] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1264.477708] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1264.477870] env[60788]: DEBUG nova.virt.hardware [None 
req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1264.478272] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1264.478457] env[60788]: DEBUG nova.virt.hardware [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1264.479304] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50695367-2f93-4e89-adcd-58e9aafc762f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.487273] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50f6d357-ba5f-4479-bd47-0e193bee3f48 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.500539] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance VIF info [] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1264.505937] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Creating folder: Project (418df53de62f42ab8032f2c9b929935c). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1264.506203] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d14d224c-559a-4df8-b48f-09e2faced754 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.515833] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Created folder: Project (418df53de62f42ab8032f2c9b929935c) in parent group-v449747. [ 1264.516040] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Creating folder: Instances. Parent ref: group-v449820. 
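
The nova.virt.hardware records above narrow flavor and image limits down to "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]". The core step is enumerating (sockets, cores, threads) triples whose product equals the vCPU count, capped by the per-dimension limits; a compact sketch:

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) with s * c * t == vcpus,
        honoring the maxima derived from flavor/image limits."""
        for s in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % s:
                continue
            rest = vcpus // s
            for c in range(1, min(rest, max_cores) + 1):
                if rest % c:
                    continue
                t = rest // c
                if t <= max_threads:
                    yield (s, c, t)

    # The 1-vCPU m1.nano flavor in the log has exactly one candidate:
    assert list(possible_topologies(1)) == [(1, 1, 1)]
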
{{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1264.516251] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-efc397b2-3637-4093-a67d-255eb6ed2a9b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.524510] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Created folder: Instances in parent group-v449820. [ 1264.524732] env[60788]: DEBUG oslo.service.loopingcall [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1264.524903] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1264.525104] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1d87f772-3c25-42b6-87cd-fb790796a741 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1264.540486] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1264.540486] env[60788]: value = "task-2205224" [ 1264.540486] env[60788]: _type = "Task" [ 1264.540486] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1264.547467] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205224, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1265.050560] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205224, 'name': CreateVM_Task, 'duration_secs': 0.272601} completed successfully. 
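
The paired "Creating folder ... / Created folder ..." records above build the Project and Instances folders before CreateVM_Task runs; the image-cache path earlier in this log goes through a similar _create_folder_if_missing step. The usual shape is create-if-missing, treating a duplicate-name fault as success so racing workers stay idempotent. A generic sketch with stand-in names, not the pyVmomi/oslo.vmware API:

    class DuplicateName(Exception):
        """Stand-in for a vSphere 'name already exists' fault."""

    def ensure_folder(create, name):
        """Call create(name); treat 'already exists' as success so the
        operation is idempotent across racing workers."""
        try:
            return create(name)
        except DuplicateName:
            return name  # another worker created it first

    made = set()
    def create(name):
        if name in made:
            raise DuplicateName(name)
        made.add(name)
        return name

    ensure_folder(create, 'Instances')
    ensure_folder(create, 'Instances')  # second call is a no-op
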
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1265.051556] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1265.051556] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1265.051707] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1265.052031] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1265.052294] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-096e1a11-ff82-4c72-b81a-6035a5da203b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.056547] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for the task: (returnval){ [ 1265.056547] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525a42e3-119e-d29e-7c15-01eb687a6b4a" [ 1265.056547] env[60788]: _type = "Task" [ 1265.056547] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1265.063784] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]525a42e3-119e-d29e-7c15-01eb687a6b4a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1265.566879] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1265.567146] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1265.567364] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1274.115270] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1274.115599] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1312.428380] env[60788]: WARNING oslo_vmware.rw_handles [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1312.428380] env[60788]: ERROR 
oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1312.428380] env[60788]: ERROR oslo_vmware.rw_handles [ 1312.429028] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1312.431128] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1312.431428] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Copying Virtual Disk [datastore2] vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/88b2dc68-52f3-4acf-8ab4-87baa74a6c9e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1312.431769] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c8f2c8e0-f3cd-4807-8dc6-8c24752c0870 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1312.441971] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Waiting for the task: (returnval){ [ 1312.441971] env[60788]: value = "task-2205225" [ 1312.441971] env[60788]: _type = "Task" [ 1312.441971] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1312.450144] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Task: {'id': task-2205225, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1312.951924] env[60788]: DEBUG oslo_vmware.exceptions [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1312.952223] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1312.952769] env[60788]: ERROR nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1312.952769] env[60788]: Faults: ['InvalidArgument'] [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Traceback (most recent call last): [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] yield resources [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] self.driver.spawn(context, instance, image_meta, [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] self._fetch_image_if_missing(context, vi) [ 1312.952769] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] image_cache(vi, tmp_image_ds_loc) [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] vm_util.copy_virtual_disk( [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] session._wait_for_task(vmdk_copy_task) [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] return self.wait_for_task(task_ref) [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] return evt.wait() [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] result = hub.switch() [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1312.953155] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] return self.greenlet.switch() [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] self.f(*self.args, **self.kw) [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] raise exceptions.translate_fault(task_info.error) [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Faults: ['InvalidArgument'] [ 1312.953522] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] [ 1312.953522] env[60788]: INFO nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Terminating instance [ 1312.954624] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1312.954840] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1312.955091] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8349231b-f2ca-4be7-8fe3-d0776d96cff4 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1312.957196] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1312.957389] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1312.958108] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-562a2780-b8ba-4149-baf5-73cb9c65305b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1312.964907] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1312.965126] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-55b3e67c-a29e-4d95-8b53-04ed786fb28e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1312.967191] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1312.967361] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1312.968275] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c2245c7e-3168-4194-82fe-769dbb15a17b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1312.972669] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Waiting for the task: (returnval){ [ 1312.972669] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52e70cad-cb1f-91b0-9c16-f82bf5ecf34c" [ 1312.972669] env[60788]: _type = "Task" [ 1312.972669] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1312.984176] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52e70cad-cb1f-91b0-9c16-f82bf5ecf34c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1313.029298] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1313.029533] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1313.029692] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Deleting the datastore file [datastore2] fe6168fd-528f-4acb-a44c-6d0b69cada6e {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1313.029957] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ef363d90-9ce1-4505-9419-50856c7a72ac {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.036405] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Waiting for the task: (returnval){ [ 1313.036405] env[60788]: value = "task-2205227" [ 1313.036405] env[60788]: _type = "Task" [ 1313.036405] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1313.043902] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Task: {'id': task-2205227, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1313.482710] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1313.483069] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Creating directory with path [datastore2] vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1313.483181] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d0735016-03a0-4ead-a909-fff161c97358 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.494362] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Created directory with path [datastore2] vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1313.494551] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Fetch image to [datastore2] vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1313.494733] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1313.495445] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6ca6612-1167-444b-90c4-cc79c8c6378d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.501866] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bab63d8-ed9a-4028-8927-11dc11bc5af8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.510792] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf699b45-b91d-44c7-8af2-e5ebde5cb54b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.543073] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-dc527f89-85a0-421a-ac76-bb4b103c7ca0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1313.551992] env[60788]: DEBUG oslo_vmware.api [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Task: {'id': task-2205227, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075616} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1313.552473] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1313.552654] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1313.552821] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1313.552994] env[60788]: INFO nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Took 0.60 seconds to destroy the instance on the hypervisor. 
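The CopyVirtualDisk_Task failure above follows the same request/poll cycle as every other vSphere call in this log: invoke a *_Task method, log the (returnval) task reference, poll it ("progress is 0%.") until it reports success or error, and translate a terminal error into a VimFaultException. The sketch below imitates that polling loop with a purely local stub; FakeTaskInfo, FakeVimFaultException, and the hard-coded state sequence are invented for illustration and are not the oslo.vmware implementation (the real loop lives in oslo_vmware/api.py).

import time
from dataclasses import dataclass

# FakeTaskInfo is an invented stand-in for the vSphere TaskInfo object that
# oslo.vmware reads back through the PropertyCollector; nothing here talks
# to a real vCenter.
@dataclass
class FakeTaskInfo:
    state: str            # 'running' | 'success' | 'error'
    progress: int = 0
    error: str | None = None

class FakeVimFaultException(Exception):
    """Local stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(poll, interval=0.5):
    """Poll a task until it reaches a terminal state.

    Mirrors the '... progress is 0%.' / 'completed successfully.' /
    'raise exceptions.translate_fault(task_info.error)' lines in the log.
    """
    while True:
        info = poll()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # The real code first tries to map the VIM fault to a concrete
            # exception class, which is where 'Fault InvalidArgument not
            # matched.' gets logged before the generic fault is raised.
            raise FakeVimFaultException(info.error)
        print(f"Task progress is {info.progress}%.")
        time.sleep(interval)

# A task that fails the way CopyVirtualDisk_Task (task-2205225) does above.
states = iter([
    FakeTaskInfo(state='running', progress=0),
    FakeTaskInfo(state='error',
                 error="A specified parameter was not correct: fileType"),
])
try:
    wait_for_task(lambda: next(states), interval=0)
except FakeVimFaultException as exc:
    print(f"Instance failed to spawn: {exc}")

Run as-is, the stub prints one progress line and then the translated fileType fault, which is the same sequence task-2205225 produces before the spawn traceback.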
[ 1313.554441] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c161a9bc-cacd-4eb1-bf82-38ce5adbbf91 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1313.556270] env[60788]: DEBUG nova.compute.claims [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1313.556442] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1313.556716] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1313.579900] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1313.723361] env[60788]: DEBUG oslo_vmware.rw_handles [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1313.777682] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1313.783815] env[60788]: DEBUG oslo_vmware.rw_handles [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1313.783815] env[60788]: DEBUG oslo_vmware.rw_handles [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1313.847962] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e9a0cd7-3df7-42d0-b783-b66fc7a6f17c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1313.855313] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc021edb-21e4-4a8a-b6d4-a0739e8ff940 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1313.885819] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fcf6db6-53f4-4cae-bbb4-1829172a588b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1313.892592] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d0047f-71fe-43d4-8838-ecda4d5e1bfa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1313.905351] env[60788]: DEBUG nova.compute.provider_tree [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1313.914324] env[60788]: DEBUG nova.scheduler.client.report [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1313.929388] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.373s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1313.929893] env[60788]: ERROR nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1313.929893] env[60788]: Faults: ['InvalidArgument']
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Traceback (most recent call last):
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     self.driver.spawn(context, instance, image_meta,
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     self._fetch_image_if_missing(context, vi)
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     image_cache(vi, tmp_image_ds_loc)
[ 1313.929893] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     vm_util.copy_virtual_disk(
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     session._wait_for_task(vmdk_copy_task)
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     return self.wait_for_task(task_ref)
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     return evt.wait()
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     result = hub.switch()
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     return self.greenlet.switch()
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1313.930271] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     self.f(*self.args, **self.kw)
[ 1313.930605] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1313.930605] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]     raise exceptions.translate_fault(task_info.error)
[ 1313.930605] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1313.930605] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Faults: ['InvalidArgument']
[ 1313.930605] env[60788]: ERROR nova.compute.manager [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e]
[ 1313.930605] env[60788]: DEBUG nova.compute.utils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1313.932333] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Build of instance fe6168fd-528f-4acb-a44c-6d0b69cada6e was re-scheduled: A specified parameter was not correct: fileType
[ 1313.932333] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1313.932705] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1313.932878] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1313.933057] env[60788]: DEBUG nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1313.933218] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1314.249779] env[60788]: DEBUG nova.network.neutron [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1314.262158] env[60788]: INFO nova.compute.manager [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Took 0.33 seconds to deallocate network for instance.
[ 1314.368020] env[60788]: INFO nova.scheduler.client.report [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Deleted allocations for instance fe6168fd-528f-4acb-a44c-6d0b69cada6e
[ 1314.385902] env[60788]: DEBUG oslo_concurrency.lockutils [None req-90d616b7-0a53-4883-a3fd-a832e9da16bb tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 671.133s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1314.386964] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 474.743s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1314.387196] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Acquiring lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1314.387395] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1314.387557] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1314.389941] env[60788]: INFO nova.compute.manager [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Terminating instance
[ 1314.391671] env[60788]: DEBUG nova.compute.manager [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1314.391865] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1314.392382] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b6c8d7ef-ecb0-4729-ab3e-29ba1906583c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.401464] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a930f23c-2e7c-4bf9-994e-0e4b48cea4c0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.412580] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1314.433068] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fe6168fd-528f-4acb-a44c-6d0b69cada6e could not be found.
[ 1314.433285] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1314.433463] env[60788]: INFO nova.compute.manager [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Took 0.04 seconds to destroy the instance on the hypervisor.
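The lock traffic above ('Acquiring lock ... by ...', 'acquired ... :: waited', '"released" ... :: held') is oslo.concurrency's bookkeeping around per-instance and per-resource critical sections; the waited/held durations are what make stalls like the 474.743s wait on fe6168fd visible in this log. The following is a minimal imitation of that bookkeeping under stated assumptions: timed_lock and _locks are invented names, the real lockutils also handles external file locks and semaphores, and the message wording is only copied from the log for recognizability.

import threading
import time
from contextlib import contextmanager

# A process-local lock table; the real lockutils additionally supports
# external (file-based) locks and fair locking, which this sketch ignores.
_locks: dict[str, threading.Lock] = {}

@contextmanager
def timed_lock(name: str, caller: str):
    """Serialize a critical section and report waited/held times,
    in the spirit of the lockutils 'inner' log lines above."""
    lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{caller}"')
    start = time.monotonic()
    with lock:
        waited = time.monotonic() - start
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        held_from = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - held_from
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

# Usage mirroring the per-instance serialization above: a caller trying to
# terminate fe6168fd-... must first wait out whoever holds the build lock.
with timed_lock("fe6168fd-528f-4acb-a44c-6d0b69cada6e",
                "do_terminate_instance"):
    pass  # the terminate/destroy work would run here

Because both the build and the terminate paths key their critical sections on the instance UUID, the long "held 671.133s" on the build side translates directly into the long "waited 474.743s" on the terminate side.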
[ 1314.433708] env[60788]: DEBUG oslo.service.loopingcall [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1314.433941] env[60788]: DEBUG nova.compute.manager [-] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1314.434051] env[60788]: DEBUG nova.network.neutron [-] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1314.464472] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1314.464733] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1314.466195] env[60788]: INFO nova.compute.claims [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1314.477161] env[60788]: DEBUG nova.network.neutron [-] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1314.487173] env[60788]: INFO nova.compute.manager [-] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] Took 0.05 seconds to deallocate network for instance.
[ 1314.588936] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a6e2530f-764b-4f0c-962d-e5bc5d51015d tempest-ServerActionsTestOtherB-310570437 tempest-ServerActionsTestOtherB-310570437-project-member] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.202s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1314.589807] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 238.838s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1314.589994] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fe6168fd-528f-4acb-a44c-6d0b69cada6e] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1314.590182] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "fe6168fd-528f-4acb-a44c-6d0b69cada6e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1314.707721] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbb86bd7-2ec8-4fdf-8214-72373fba2cce {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.715249] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eac9d01-febf-450e-a8da-c5f3d5255ed9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.746227] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-281a2bc8-c2c3-41a7-ade0-cd70677742d2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.752973] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc1d524e-b2b3-4e84-b0f4-976305cafa34 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.765647] env[60788]: DEBUG nova.compute.provider_tree [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1314.773998] env[60788]: DEBUG nova.scheduler.client.report [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1314.786548] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1314.787052] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1314.822286] env[60788]: DEBUG nova.compute.utils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1314.823397] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1314.823568] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1314.831273] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1314.890826] env[60788]: DEBUG nova.policy [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89b673319ed34de9859c0f58f1c616c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d4606e74dad40acba2d78ea01a69919', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1314.893814] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1314.920208] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1314.920452] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1314.920606] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1314.920803] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1314.920969] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1314.921129] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1314.921342] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1314.921505] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1314.921671] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1314.921831] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1314.922025] env[60788]: DEBUG nova.virt.hardware [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1314.922874] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d0287c5-0382-4e26-989c-14252fb616dd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1314.931207] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b313bb-97a2-4b49-8d70-147c7ad8970a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1315.184577] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Successfully created port: dea8c02f-4d8a-425f-8805-136a4fb2b0f0 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1316.154812] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "d63f9834-818b-4087-851c-d7394d20b89d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1316.234273] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Successfully updated port: dea8c02f-4d8a-425f-8805-136a4fb2b0f0 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1316.247055] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "refresh_cache-688ff077-9505-48f5-9117-0a7f115f254c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1316.247245] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "refresh_cache-688ff077-9505-48f5-9117-0a7f115f254c" {{(pid=60788) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1316.247398] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1316.286041] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1316.438630] env[60788]: DEBUG nova.compute.manager [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Received event network-vif-plugged-dea8c02f-4d8a-425f-8805-136a4fb2b0f0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1316.438862] env[60788]: DEBUG oslo_concurrency.lockutils [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] Acquiring lock "688ff077-9505-48f5-9117-0a7f115f254c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1316.439078] env[60788]: DEBUG oslo_concurrency.lockutils [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] Lock "688ff077-9505-48f5-9117-0a7f115f254c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1316.439327] env[60788]: DEBUG oslo_concurrency.lockutils [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] Lock "688ff077-9505-48f5-9117-0a7f115f254c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1316.439610] env[60788]: DEBUG nova.compute.manager [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] No waiting events found dispatching network-vif-plugged-dea8c02f-4d8a-425f-8805-136a4fb2b0f0 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1316.439610] env[60788]: WARNING nova.compute.manager [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Received unexpected event network-vif-plugged-dea8c02f-4d8a-425f-8805-136a4fb2b0f0 for instance with vm_state building and task_state spawning. 
[ 1316.439695] env[60788]: DEBUG nova.compute.manager [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Received event network-changed-dea8c02f-4d8a-425f-8805-136a4fb2b0f0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1316.439843] env[60788]: DEBUG nova.compute.manager [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Refreshing instance network info cache due to event network-changed-dea8c02f-4d8a-425f-8805-136a4fb2b0f0. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1316.439995] env[60788]: DEBUG oslo_concurrency.lockutils [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] Acquiring lock "refresh_cache-688ff077-9505-48f5-9117-0a7f115f254c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1316.448235] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Updating instance_info_cache with network_info: [{"id": "dea8c02f-4d8a-425f-8805-136a4fb2b0f0", "address": "fa:16:3e:ee:d5:9f", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdea8c02f-4d", "ovs_interfaceid": "dea8c02f-4d8a-425f-8805-136a4fb2b0f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1317.118571] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "refresh_cache-688ff077-9505-48f5-9117-0a7f115f254c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1317.118886] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance network_info: |[{"id": "dea8c02f-4d8a-425f-8805-136a4fb2b0f0", "address": "fa:16:3e:ee:d5:9f", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": 
[{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdea8c02f-4d", "ovs_interfaceid": "dea8c02f-4d8a-425f-8805-136a4fb2b0f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1317.119202] env[60788]: DEBUG oslo_concurrency.lockutils [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] Acquired lock "refresh_cache-688ff077-9505-48f5-9117-0a7f115f254c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1317.119414] env[60788]: DEBUG nova.network.neutron [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Refreshing network info cache for port dea8c02f-4d8a-425f-8805-136a4fb2b0f0 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1317.124021] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ee:d5:9f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dea8c02f-4d8a-425f-8805-136a4fb2b0f0', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1317.127833] env[60788]: DEBUG oslo.service.loopingcall [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1317.131009] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1317.131461] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c86ea2b0-146c-4d3f-a57f-04cd0368672a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.151695] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1317.151695] env[60788]: value = "task-2205228" [ 1317.151695] env[60788]: _type = "Task" [ 1317.151695] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1317.159662] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205228, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1317.445867] env[60788]: DEBUG nova.network.neutron [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Updated VIF entry in instance network info cache for port dea8c02f-4d8a-425f-8805-136a4fb2b0f0. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1317.446326] env[60788]: DEBUG nova.network.neutron [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Updating instance_info_cache with network_info: [{"id": "dea8c02f-4d8a-425f-8805-136a4fb2b0f0", "address": "fa:16:3e:ee:d5:9f", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdea8c02f-4d", "ovs_interfaceid": "dea8c02f-4d8a-425f-8805-136a4fb2b0f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1317.456197] env[60788]: DEBUG oslo_concurrency.lockutils [req-14deb810-4c1e-4a97-b581-436f95d3b752 req-bee02d32-428a-40eb-94a5-5b88fcd2db95 service nova] Releasing lock "refresh_cache-688ff077-9505-48f5-9117-0a7f115f254c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1317.660947] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205228, 'name': CreateVM_Task, 'duration_secs': 0.299802} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1317.661144] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1317.661758] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1317.661920] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1317.662247] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1317.662522] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4ae1e5b5-3c87-4f81-bf1a-a13de8318cf2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.666577] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1317.666577] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5290b47d-a317-8199-aa30-083a8ca094b6" [ 1317.666577] env[60788]: _type = "Task" [ 1317.666577] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1317.673726] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5290b47d-a317-8199-aa30-083a8ca094b6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1317.753324] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1318.176966] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1318.177314] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1318.177475] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1319.754043] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1319.754043] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1320.755492] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1321.753831] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.749407] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.752987] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1322.753158] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1322.753277] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1322.775817] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.775967] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776119] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776246] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776368] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776571] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776610] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776728] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776844] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.776960] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1322.777089] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1323.753419] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1323.753735] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1323.764761] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1323.764978] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1323.765149] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1323.765303] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1323.766386] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbe33102-eefb-4f81-8ba0-8fcb41d94fdf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.775152] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33809bb8-5f96-4e33-b053-9f22af91642b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.789973] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b16e6d94-73fb-4c42-ba84-671078513f28 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.796239] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-722371ab-25d0-4dbe-ba62-a992a98abfd0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1323.824758] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181270MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1323.825091] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1323.825184] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1323.899022] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899022] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899022] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899186] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899186] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899342] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899503] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899682] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899779] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.899896] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1323.910740] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.920625] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.930390] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance bbfb23ab-0f4d-4195-ad4f-12b405a28267 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.941091] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 778f4021-05ef-4904-864e-769e035df239 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.950842] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1793989e-b036-47d0-a036-5960936e145a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.960633] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1323.960872] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1323.961059] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1324.141653] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d78a3a62-4840-48ad-b9c7-f50b7496978d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.148934] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e839bf1e-6938-421b-b18d-4a6d934d760a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.179565] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9f621a1-68cd-43f8-8a48-232e499cbd19 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.186240] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-108369c0-0af5-4745-b714-ea8121bd8479 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1324.198949] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1324.207155] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1324.220232] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1324.220374] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1325.857361] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "67c365fa-74b8-4a57-abbc-c143990a0292" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1325.857637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "67c365fa-74b8-4a57-abbc-c143990a0292" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1326.221066] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1331.977959] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "688ff077-9505-48f5-9117-0a7f115f254c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1363.000090] env[60788]: WARNING oslo_vmware.rw_handles [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1363.000090] env[60788]: ERROR oslo_vmware.rw_handles [ 1363.000838] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the 
data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1363.003026] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1363.003312] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Copying Virtual Disk [datastore2] vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/ed4c8040-cf75-4791-a7ea-968730e177a5/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1363.003966] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d5f860a2-01a3-4846-9635-3cdc7115fd33 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.011836] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Waiting for the task: (returnval){ [ 1363.011836] env[60788]: value = "task-2205229" [ 1363.011836] env[60788]: _type = "Task" [ 1363.011836] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.021075] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Task: {'id': task-2205229, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.522379] env[60788]: DEBUG oslo_vmware.exceptions [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1363.522683] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1363.523251] env[60788]: ERROR nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1363.523251] env[60788]: Faults: ['InvalidArgument'] [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Traceback (most recent call last): [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] yield resources [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self.driver.spawn(context, instance, image_meta, [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self._fetch_image_if_missing(context, vi) [ 1363.523251] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] image_cache(vi, tmp_image_ds_loc) [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] vm_util.copy_virtual_disk( [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] session._wait_for_task(vmdk_copy_task) [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] return self.wait_for_task(task_ref) [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] return evt.wait() [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] result = hub.switch() [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1363.523581] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] return self.greenlet.switch() [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self.f(*self.args, **self.kw) [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] raise exceptions.translate_fault(task_info.error) [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Faults: ['InvalidArgument'] [ 1363.523893] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] [ 1363.523893] env[60788]: INFO nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Terminating instance [ 1363.525492] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1363.525705] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1363.525947] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-55444a4b-1bdd-478d-ac6a-290ff65968d1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.528325] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1363.528536] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1363.529294] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-756ce46d-fd1c-44c1-bb01-e1e0bc88b215 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.536149] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1363.536369] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ada35667-debd-468e-8121-89230cbcbb01 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.538610] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1363.538784] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1363.539766] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-62861257-4220-4ef8-a641-b989b41f525a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.544460] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 1363.544460] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]526b8a4b-3008-25b1-1ce0-23a97d1693b9" [ 1363.544460] env[60788]: _type = "Task" [ 1363.544460] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.555134] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]526b8a4b-3008-25b1-1ce0-23a97d1693b9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1363.616898] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1363.617190] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1363.617403] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Deleting the datastore file [datastore2] af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1363.617704] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ec829461-321f-44a3-94aa-c31446a12690 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1363.624181] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Waiting for the task: (returnval){ [ 1363.624181] env[60788]: value = "task-2205231" [ 1363.624181] env[60788]: _type = "Task" [ 1363.624181] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1363.632589] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Task: {'id': task-2205231, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1364.055169] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1364.055546] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Creating directory with path [datastore2] vmware_temp/d88dbd32-4b95-45be-a66a-b7e06c142bee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1364.055657] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f74487c8-e320-4232-8539-5ce26fa7d69c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.067257] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Created directory with path [datastore2] vmware_temp/d88dbd32-4b95-45be-a66a-b7e06c142bee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1364.067472] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Fetch image to [datastore2] vmware_temp/d88dbd32-4b95-45be-a66a-b7e06c142bee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1364.067591] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/d88dbd32-4b95-45be-a66a-b7e06c142bee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1364.068357] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c411c5c2-689e-40af-a0b7-461da8d7bccb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.074782] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fce52d2e-b5eb-4ab2-b851-24dca9fb7c6e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.083645] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab80e485-e3c9-4b78-a71c-785c1579f8c5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.113825] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b7e3cfdb-cde1-46c5-81c9-3f641017eb2f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.119162] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1663d8f9-f9ba-44c0-af33-d0983872373a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.133093] env[60788]: DEBUG oslo_vmware.api [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Task: {'id': task-2205231, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.102826} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1364.133320] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1364.133489] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1364.133649] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1364.133817] env[60788]: INFO nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Took 0.61 seconds to destroy the instance on the hypervisor. 
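The task polls above trace oslo.vmware's standard pattern: an asynchronous vSphere call returns a Task managed object, and wait_for_task polls its TaskInfo (the "progress is 0%" records) until it completes or the stored fault is translated and raised, which is how the earlier "InvalidArgument: fileType" failure surfaced as VimFaultException. A minimal sketch of that pattern follows; the host, credentials, datastore path, and datacenter reference are placeholders, not values from this deployment.

    # Sketch of the invoke/poll pattern behind the wait_for_task records.
    # Host, credentials, path, and the datacenter moref are placeholders.
    from oslo_vmware import api, exceptions

    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=3, task_poll_interval=0.5)
    dc_moref = None  # placeholder: a real Datacenter managed object reference

    try:
        # Asynchronous vSphere call; the return value is a Task moref.
        task = session.invoke_api(
            session.vim, 'DeleteDatastoreFile_Task',
            session.vim.service_content.fileManager,
            name='[datastore2] vmware_temp/example.vmdk',
            datacenter=dc_moref)
        # Polls TaskInfo every task_poll_interval seconds; a task error is
        # translated into an exception, e.g. VimFaultException carrying
        # fault_list ['InvalidArgument'] as in the records above.
        session.wait_for_task(task)
    except exceptions.VimFaultException as e:
        print('task failed:', e.fault_list, e)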
[ 1364.135868] env[60788]: DEBUG nova.compute.claims [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1364.136043] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1364.136261] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1364.140129] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1364.268707] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1364.270450] env[60788]: ERROR nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. 
[ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = getattr(controller, method)(*args, **kwargs) [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._get(image_id) [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1364.270450] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] resp, body = self.http_client.get(url, headers=header) [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.request(url, 'GET', **kwargs) [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._handle_response(resp) [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exc.from_response(resp, resp.content) [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1364.271083] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] yield resources [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self.driver.spawn(context, instance, image_meta, [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._fetch_image_if_missing(context, vi) [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image_fetch(context, vi, tmp_image_ds_loc) [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] images.fetch_image( [ 1364.271624] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] metadata = IMAGE_API.get(context, image_ref) [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return session.show(context, image_id, [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] _reraise_translated_image_exception(image_id) [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise new_exc.with_traceback(exc_trace) [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = getattr(controller, method)(*args, **kwargs) [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1364.271963] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._get(image_id) [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] resp, body = self.http_client.get(url, headers=header) [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.request(url, 'GET', **kwargs) [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._handle_response(resp) [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exc.from_response(resp, resp.content) [ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] nova.exception.ImageNotAuthorized: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. 
[ 1364.272336] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1364.272641] env[60788]: INFO nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Terminating instance [ 1364.272641] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1364.272641] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1364.272945] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c1170989-26d9-48c0-9a00-a6ed6930752a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.275737] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1364.275923] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1364.276840] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8814b2d7-f50f-4f6e-a95c-0bbe8441f684 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.282621] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1364.282793] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1364.283930] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fcaf16f4-8f66-43c0-ad4c-f4d15172098c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.287968] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1364.288794] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6c974cd8-5774-4312-818e-e74437c01708 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.291145] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Waiting for the task: (returnval){ [ 1364.291145] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]521757e3-d489-e3b0-328c-424211ed2f99" [ 1364.291145] env[60788]: _type = "Task" [ 1364.291145] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1364.303687] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1364.303909] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Creating directory with path [datastore2] vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1364.304122] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2156a385-3d6b-4e85-aeb6-b1d3d839c364 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.324325] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Created directory with path [datastore2] vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1364.324536] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Fetch image to [datastore2] vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1364.324773] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1364.325617] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1aeb071-20ee-4a48-8f46-a27900fab4a3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.334932] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21ae8e71-d73a-4f71-91c3-116dee84b2b3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.346381] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3bb360a-d976-4794-a8e0-85c5a8655ac8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.383869] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d1ebfe1-86e5-479e-9d9b-08ba5ecd0c7a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.386832] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1364.386832] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1364.386832] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Deleting the datastore file [datastore2] 5c7c0b6d-d4ea-4c78-8a76-934859d6571e {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1364.387084] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-255e3f06-19e5-41db-8386-00d8345701dd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.393073] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b1827b8e-85c1-4abd-81c2-e67daa29860b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.394789] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for the task: (returnval){ [ 
1364.394789] env[60788]: value = "task-2205233" [ 1364.394789] env[60788]: _type = "Task" [ 1364.394789] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1364.405835] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': task-2205233, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1364.417900] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1364.436041] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37ce4634-e2b1-4ddc-a2f3-0e052a2c7760 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.444882] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e56470aa-b04d-437f-844a-3ffe8ad63d69 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.479845] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f78cfe54-7b08-405a-bfda-de88ad533a7e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.487048] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3adc9a95-1791-43d8-a263-58e84d275e49 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1364.500200] env[60788]: DEBUG nova.compute.provider_tree [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1364.502623] env[60788]: DEBUG oslo_vmware.rw_handles [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1364.559611] env[60788]: DEBUG nova.scheduler.client.report [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1364.564737] env[60788]: DEBUG oslo_vmware.rw_handles [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1364.564917] env[60788]: DEBUG oslo_vmware.rw_handles [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1364.580693] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.444s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1364.581289] env[60788]: ERROR nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1364.581289] env[60788]: Faults: ['InvalidArgument'] [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Traceback (most recent call last): [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self.driver.spawn(context, instance, image_meta, [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: 
af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self._fetch_image_if_missing(context, vi) [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] image_cache(vi, tmp_image_ds_loc) [ 1364.581289] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] vm_util.copy_virtual_disk( [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] session._wait_for_task(vmdk_copy_task) [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] return self.wait_for_task(task_ref) [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] return evt.wait() [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] result = hub.switch() [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] return self.greenlet.switch() [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1364.582358] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] self.f(*self.args, **self.kw) [ 1364.582982] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1364.582982] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] raise exceptions.translate_fault(task_info.error) [ 1364.582982] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1364.582982] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Faults: 
['InvalidArgument'] [ 1364.582982] env[60788]: ERROR nova.compute.manager [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] [ 1364.582982] env[60788]: DEBUG nova.compute.utils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1364.583758] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Build of instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 was re-scheduled: A specified parameter was not correct: fileType [ 1364.583758] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1364.584157] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1364.584511] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1364.584511] env[60788]: DEBUG nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1364.584724] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1364.904691] env[60788]: DEBUG oslo_vmware.api [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Task: {'id': task-2205233, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068192} completed successfully.
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1364.906354] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1364.906354] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1364.906354] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1364.906354] env[60788]: INFO nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1364.907690] env[60788]: DEBUG nova.compute.claims [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1364.907858] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1364.908087] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.010915] env[60788]: DEBUG nova.network.neutron [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1365.025317] env[60788]: INFO nova.compute.manager [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Took 0.44 seconds to deallocate network for instance. 
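The "compute_resources" acquire/release pairs above (and the long waited/held durations on the per-instance lock later in this log) come from oslo.concurrency: Nova wraps resource-tracker methods such as abort_instance_claim in lockutils.synchronized, whose inner wrapper logs "Acquiring lock", "acquired ... waited", and ""released" ... held" together with the callable's qualified name. A sketch of that decorator with a hypothetical stand-in function, not Nova's actual code:

    # Sketch of the lockutils pattern behind the "compute_resources" records;
    # the decorated function is a hypothetical stand-in, not Nova code.
    import logging
    import time

    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)  # surface the DEBUG lock records

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        time.sleep(0.1)  # stand-in for usage bookkeeping done under the lock

    abort_instance_claim()  # logs acquire/release with waited/held durations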
[ 1365.115798] env[60788]: INFO nova.scheduler.client.report [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Deleted allocations for instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 [ 1365.139171] env[60788]: DEBUG oslo_concurrency.lockutils [None req-d04e3563-18dc-44a6-bb60-0966f3e68328 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 681.621s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.140442] env[60788]: DEBUG oslo_concurrency.lockutils [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 485.544s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.140537] env[60788]: DEBUG oslo_concurrency.lockutils [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Acquiring lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.141422] env[60788]: DEBUG oslo_concurrency.lockutils [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.141422] env[60788]: DEBUG oslo_concurrency.lockutils [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.142745] env[60788]: INFO nova.compute.manager [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Terminating instance [ 1365.144352] env[60788]: DEBUG nova.compute.manager [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1365.144545] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1365.144991] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-809b2ffb-9d3f-4dc8-a1e6-9b9d7dcf87f1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.158721] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e161d7ce-6500-4e68-97b3-e752bba4dcad {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.171610] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1365.191774] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83 could not be found. [ 1365.192918] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1365.192918] env[60788]: INFO nova.compute.manager [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1365.192918] env[60788]: DEBUG oslo.service.loopingcall [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1365.194779] env[60788]: DEBUG nova.compute.manager [-] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1365.194884] env[60788]: DEBUG nova.network.neutron [-] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1365.206909] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e2d2c8-228b-460d-91a2-31c5b6f3a365 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.221148] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c340607-b884-4787-ba46-40b007d0ed56 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.224921] env[60788]: DEBUG nova.network.neutron [-] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1365.226818] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.255085] env[60788]: INFO nova.compute.manager [-] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] Took 0.06 seconds to deallocate network for instance. 
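The inventory payload reported at 1364.559611 above (and unchanged again just below) is what Placement schedules against: for each resource class the usable capacity is (total - reserved) * allocation_ratio, so this node advertises (48 - 0) * 4.0 = 192 VCPUs, (196590 - 512) * 1.0 = 196078 MB of RAM, and (400 - 0) * 1.0 = 400 GB of disk, subject to the per-allocation max_unit caps (16 VCPUs, 65530 MB, 176 GB). An illustrative helper, not a Nova or Placement API:

    # Illustrative only: the capacity Placement derives from one inventory
    # record is (total - reserved) * allocation_ratio.
    def effective_capacity(inv):
        return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1,
                      'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1,
                    'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        print(rc, effective_capacity(inv))  # 192.0, 196078.0, 400.0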
[ 1365.255802] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f66e86bf-0cd1-4a1e-ad61-7c4872087458 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.267332] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e9b8c96-05af-4689-a09f-4429c8f82005 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.281843] env[60788]: DEBUG nova.compute.provider_tree [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1365.291800] env[60788]: DEBUG nova.scheduler.client.report [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1365.304651] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.396s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.305358] env[60788]: ERROR nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. 
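The ERROR above, and the traceback that follows it, show an exception-translation step: glanceclient raises HTTPUnauthorized, and nova/image/glance.py re-raises it as the domain-level ImageNotAuthorized via _reraise_translated_image_exception while keeping the original traceback. A hedged sketch of that idiom (the exception classes below are stand-ins, not the real glanceclient or nova types):

```python
import sys


class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized."""


class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""
    def __init__(self, image_id):
        super().__init__(f"Not authorized for image {image_id}.")


def show_image(client, image_id):
    try:
        return client.get(image_id)
    except HTTPUnauthorized:
        # Swap the exception type but preserve the traceback of the
        # original failure -- the with_traceback() idiom visible below.
        exc_trace = sys.exc_info()[2]
        raise ImageNotAuthorized(image_id).with_traceback(exc_trace)
```

Callers then handle one stable exception type regardless of client internals, which is why the traceback below shows both the raw HTTP 401 and the translated ImageNotAuthorized.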
[ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = getattr(controller, method)(*args, **kwargs) [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._get(image_id) [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.305358] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] resp, body = self.http_client.get(url, headers=header) [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.request(url, 'GET', **kwargs) [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._handle_response(resp) [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exc.from_response(resp, resp.content) [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.305769] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self.driver.spawn(context, instance, image_meta, [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._fetch_image_if_missing(context, vi) [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image_fetch(context, vi, tmp_image_ds_loc) [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] images.fetch_image( [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] metadata = IMAGE_API.get(context, image_ref) [ 1365.306589] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return session.show(context, image_id, [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] _reraise_translated_image_exception(image_id) [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise new_exc.with_traceback(exc_trace) [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 
5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = getattr(controller, method)(*args, **kwargs) [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._get(image_id) [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.307344] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] resp, body = self.http_client.get(url, headers=header) [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.request(url, 'GET', **kwargs) [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._handle_response(resp) [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exc.from_response(resp, resp.content) [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] nova.exception.ImageNotAuthorized: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. [ 1365.307839] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.308231] env[60788]: DEBUG nova.compute.utils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. 
{{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1365.308231] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.080s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1365.308684] env[60788]: INFO nova.compute.claims [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1365.311465] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Build of instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e was re-scheduled: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1365.311730] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1365.311896] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1365.312073] env[60788]: DEBUG nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1365.312238] env[60788]: DEBUG nova.network.neutron [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1365.347411] env[60788]: DEBUG oslo_concurrency.lockutils [None req-471d992a-341a-4918-be83-c74d98abdec3 tempest-InstanceActionsTestJSON-2067428368 tempest-InstanceActionsTestJSON-2067428368-project-member] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.207s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1365.350126] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 289.598s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1365.350275] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1365.350455] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "af9c7ab4-c6ac-4f5f-b016-b937ee1d5f83" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1365.424075] env[60788]: DEBUG neutronclient.v2_0.client [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60788) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1365.426354] env[60788]: ERROR nova.compute.manager [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
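The neutronclient 401 above is translated the same way: as the traceback below shows, every call in nova/network/neutron.py goes through a wrapper (line 196) that re-raises neutronclient's Unauthorized as nova.exception.Unauthorized when the user's token was rejected (line 204), or as NeutronAdminCredentialConfigurationInvalid when Nova's own admin client got the 401 (line 212, seen later at [ 1365.713278]). A simplified sketch of that mapping, with stand-in exception types rather than the real ones:

```python
import functools


class NeutronUnauthorized(Exception):
    """Stand-in for neutronclient.common.exceptions.Unauthorized."""


class Unauthorized(Exception):
    """Stand-in: the end user's token was rejected."""


class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stand-in: Nova's own [neutron] credentials in nova.conf are bad."""


def translate_unauthorized(is_admin_client):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except NeutronUnauthorized:
                if is_admin_client:
                    # 401 with the service's own credentials: operator
                    # misconfiguration, not a user error.
                    raise NeutronAdminCredentialConfigurationInvalid()
                raise Unauthorized()
        return wrapper
    return decorator
```

The distinction matters operationally: the first variant points at the request's (here, expired tempest) token, while the second tells the operator to check nova.conf.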
[ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = getattr(controller, method)(*args, **kwargs) [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._get(image_id) [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.426354] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] resp, body = self.http_client.get(url, headers=header) [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.request(url, 'GET', **kwargs) [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._handle_response(resp) [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exc.from_response(resp, resp.content) [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.426876] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self.driver.spawn(context, instance, image_meta, [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._fetch_image_if_missing(context, vi) [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image_fetch(context, vi, tmp_image_ds_loc) [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] images.fetch_image( [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] metadata = IMAGE_API.get(context, image_ref) [ 1365.427349] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return session.show(context, image_id, [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] _reraise_translated_image_exception(image_id) [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise new_exc.with_traceback(exc_trace) [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 
5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = getattr(controller, method)(*args, **kwargs) [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._get(image_id) [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1365.427898] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] resp, body = self.http_client.get(url, headers=header) [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.request(url, 'GET', **kwargs) [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self._handle_response(resp) [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exc.from_response(resp, resp.content) [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] nova.exception.ImageNotAuthorized: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. 
[ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.428201] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2431, in _do_build_and_run_instance [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._build_and_run_instance(context, instance, image, [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2723, in _build_and_run_instance [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exception.RescheduledException( [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] nova.exception.RescheduledException: Build of instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e was re-scheduled: Not authorized for image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21. [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.428488] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] exception_handler_v20(status_code, error_body) [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise client_exc(message=error_message, [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Neutron server returns request_ids: ['req-f2c1345a-c72e-4bfe-a304-bfcbc52f1048'] [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 
5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3020, in _cleanup_allocated_networks [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._deallocate_network(context, instance, requested_networks) [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self.network_api.deallocate_for_instance( [ 1365.428788] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] data = neutron.list_ports(**search_opts) [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.list('ports', self.ports_path, retrieve_all, [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] for r in self._pagination(collection, path, **params): [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] res = self.get(path, params=params) [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.429184] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 
5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.retry_request("GET", action, body=body, [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.do_request(method, action, body=body, [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._handle_fault_response(status_code, replybody, resp) [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exception.Unauthorized() [ 1365.429491] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] nova.exception.Unauthorized: Not authorized. 
[ 1365.429785] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.476895] env[60788]: INFO nova.scheduler.client.report [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Deleted allocations for instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e [ 1365.494357] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0c7c4214-442a-4781-b2ec-379fe75422a6 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 641.143s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.498059] env[60788]: DEBUG oslo_concurrency.lockutils [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 444.986s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.498315] env[60788]: DEBUG oslo_concurrency.lockutils [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Acquiring lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1365.498530] env[60788]: DEBUG oslo_concurrency.lockutils [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.498693] env[60788]: DEBUG oslo_concurrency.lockutils [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.500392] env[60788]: INFO nova.compute.manager [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Terminating instance [ 1365.502013] env[60788]: DEBUG nova.compute.manager [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1365.502230] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1365.502853] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-133d048b-159c-4da4-b1bc-6b643f82f3fc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.508170] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1365.517648] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d953368-c14c-41ed-951e-585c2d180ba3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.545628] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5c7c0b6d-d4ea-4c78-8a76-934859d6571e could not be found.
[ 1365.545851] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1365.546045] env[60788]: INFO nova.compute.manager [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1365.546296] env[60788]: DEBUG oslo.service.loopingcall [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
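The "Waiting for function ... to return." DEBUG record above is emitted by oslo.service: the network deallocation is wrapped in a RetryDecorator, which drives it through a looping call and blocks the caller until it finishes. A sketch of the pattern, assuming oslo.service is installed; the retry parameters, exception type, and helper below are illustrative, not Nova's exact values:

```python
from oslo_service import loopingcall


class TransientNetworkError(Exception):
    """Hypothetical stand-in for a retryable deallocation failure."""


def deallocate_network():
    """Hypothetical helper; the real code calls the network API here."""


@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                            max_sleep_time=30,
                            exceptions=(TransientNetworkError,))
def _deallocate_network_with_retries():
    # Retried with increasing sleeps on the listed exceptions; anything
    # else (like the 401 below) propagates immediately, and the last
    # exception is re-raised once max_retry_count is exhausted.
    deallocate_network()


# Calling the decorated function emits the "Waiting for function ... to
# return." DEBUG line and blocks the calling greenlet until it completes.
_deallocate_network_with_retries()
```

That explains the loopingcall.py frames in the tracebacks further down: the retry wrapper, not the manager, is the immediate caller of the deallocation.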
[ 1365.552940] env[60788]: DEBUG nova.compute.manager [-] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1365.553058] env[60788]: DEBUG nova.network.neutron [-] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1365.569348] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1365.589700] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60b25b91-1dcf-44b2-ae35-25429dcd0fc4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.596850] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3389dab1-16fe-48fa-94e0-31d7cfa8aa60 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.631326] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6230a40e-1ac9-4bfd-aa0c-1f140a5f87f1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.637251] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fe0e405-5f2d-4775-98f8-039c50890433 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.651521] env[60788]: DEBUG nova.compute.provider_tree [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1365.659784] env[60788]: DEBUG nova.scheduler.client.report [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1365.676069] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.369s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1365.676580] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1365.679659] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.110s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1365.681234] env[60788]: INFO nova.compute.claims [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1365.712064] env[60788]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60788) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1365.712411] env[60788]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1365.713278] env[60788]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
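Before the failure above, the claim at [ 1365.681234] succeeded against the inventory reported at [ 1365.659784]. As a worked example, assuming placement's usual capacity test of usable = (total - reserved) * allocation_ratio, that inventory gives:

```python
# Inventory values copied from the log record above; min_unit/step_size
# omitted for brevity.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {usable:g} schedulable")
# -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400. Note max_unit still caps
#    any single allocation (here 16 VCPU, 65530 MB, 176 GB per instance).
```

The looping-call error above and the traceback that follows belong to the teardown of instance 5c7c0b6d, not to this claim: the retried deallocation finally failed because the admin token itself was rejected.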
[ 1365.713278] env[60788]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-d5d18640-6197-4c91-ae8a-0f00dc5394cc'] [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1365.713278] env[60788]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.713694] env[60788]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.713694] env[60788]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.714269] env[60788]: ERROR oslo.service.loopingcall [ 1365.714764] env[60788]: ERROR nova.compute.manager [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.718709] env[60788]: DEBUG nova.compute.utils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1365.720302] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Allocating IP information in the background. 
{{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1365.720510] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1365.735792] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1365.754023] env[60788]: ERROR nova.compute.manager [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] exception_handler_v20(status_code, error_body) [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise client_exc(message=error_message, [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Neutron server returns request_ids: ['req-d5d18640-6197-4c91-ae8a-0f00dc5394cc'] [ 1365.754023] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During handling of the above exception, another exception occurred: [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Traceback (most recent call last): [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in 
do_terminate_instance [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._delete_instance(context, instance, bdms) [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._shutdown_instance(context, instance, bdms) [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._try_deallocate_network(context, instance, requested_networks) [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] with excutils.save_and_reraise_exception(): [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.754466] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self.force_reraise() [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise self.value [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] _deallocate_network_with_retries() [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return evt.wait() [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = hub.switch() [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.greenlet.switch() [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1365.754830] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = func(*self.args, **self.kw) [ 1365.755240] env[60788]: ERROR 
nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] result = f(*args, **kwargs) [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._deallocate_network( [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self.network_api.deallocate_for_instance( [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] data = neutron.list_ports(**search_opts) [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.list('ports', self.ports_path, retrieve_all, [ 1365.755240] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] for r in self._pagination(collection, path, **params): [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] res = self.get(path, params=params) [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.755599] env[60788]: ERROR 
nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.retry_request("GET", action, body=body, [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.755599] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] return self.do_request(method, action, body=body, [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] ret = obj(*args, **kwargs) [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] self._handle_fault_response(status_code, replybody, resp) [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1365.756009] env[60788]: ERROR nova.compute.manager [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] [ 1365.804606] env[60788]: DEBUG oslo_concurrency.lockutils [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.307s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.805754] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 290.053s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1365.805941] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] During sync_power_state the instance has a pending task (deleting). Skip.
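The "acquired ... waited 290.053s" / "released ... held 0.307s" bookkeeping above is emitted by oslo.concurrency whenever a function runs under a named lock; Nova serializes all operations on one instance by using the instance UUID as the lock name, which is why the power-state sync had to wait behind the delete. A minimal sketch of the pattern (the UUID is taken from the log; the function bodies are illustrative):

from oslo_concurrency import lockutils

INSTANCE_UUID = '5c7c0b6d-d4ea-4c78-8a76-934859d6571e'

@lockutils.synchronized(INSTANCE_UUID)
def do_terminate_instance():
    # Runs with the per-instance lock held. lockutils logs "Acquiring",
    # "acquired ... waited Ns" and "released ... held Ns" around this call,
    # which is exactly what the DEBUG lines above show.
    pass

# The same lock is also available as a context manager, e.g. for the
# query_driver_power_state_and_sync step seen in the log:
with lockutils.lock(INSTANCE_UUID):
    pass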
[ 1365.806545] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "5c7c0b6d-d4ea-4c78-8a76-934859d6571e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1365.835895] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1365.839911] env[60788]: DEBUG nova.policy [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eaa5adb0f5664ae29f963281a27bee87', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '806b14f20c24458c93ed146a412a72db', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1365.867998] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1365.868265] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1365.868425] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1365.868718] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1365.868792] env[60788]: DEBUG nova.virt.hardware
[None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1365.868893] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1365.869215] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1365.869462] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1365.869557] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1365.869750] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1365.869894] env[60788]: DEBUG nova.virt.hardware [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1365.870840] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820cec37-f829-4a5e-8b07-056ec0b95d7a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.877664] env[60788]: INFO nova.compute.manager [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] [instance: 5c7c0b6d-d4ea-4c78-8a76-934859d6571e] Successfully reverted task state from None on failure for instance. [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server [None req-636dc1c6-5ba8-4415-ada7-1ec784261823 tempest-DeleteServersAdminTestJSON-322675366 tempest-DeleteServersAdminTestJSON-322675366-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
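The "Successfully reverted task state from None on failure" line is produced by a compute-manager decorator that restores the instance's task_state when the wrapped RPC method raises, so a failed delete does not leave the instance wedged in task_state deleting; the rpc.server traceback that follows shows the same exception continuing up into oslo.messaging. A simplified, hypothetical sketch of that pattern (Nova's real decorator does more bookkeeping; `instance.save()` stands in for persisting the change):

import functools

def reverts_task_state(fn):
    """On failure, put the instance's task_state back and re-raise."""
    @functools.wraps(fn)
    def wrapper(self, context, *args, **kwargs):
        try:
            return fn(self, context, *args, **kwargs)
        except Exception:
            instance = kwargs.get('instance')
            if instance is not None:
                instance.task_state = None
                instance.save()
                # -> "Successfully reverted task state from ... on failure"
            raise
    return wrapper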
[ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-d5d18640-6197-4c91-ae8a-0f00dc5394cc'] [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1365.883337] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.883798] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1365.884270] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1365.884710] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1365.885222] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.885667] env[60788]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1365.885667] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
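Both tracebacks pass through the same wrapper at nova/network/neutron.py:196/212: every neutronclient call is routed through a decorator, and a 401 from Neutron is translated into NeutronAdminCredentialConfigurationInvalid, because the service credentials Nova uses for port cleanup are never expected to be unauthorized. A simplified sketch of that translation (the wrapper name and exception class body are stand-ins, not Nova's exact code):

import functools

from neutronclient.common import exceptions as neutron_client_exc

class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stand-in for nova.exception.NeutronAdminCredentialConfigurationInvalid."""

def translate_neutron_exceptions(fn):
    """Route a neutronclient call and reclassify 401s as config errors."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except neutron_client_exc.Unauthorized:
            # Admin/service credentials should never 401; treat it as a
            # configuration problem rather than a transient API error.
            raise NeutronAdminCredentialConfigurationInvalid()
    return wrapper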
[ 1365.886187] env[60788]: ERROR oslo_messaging.rpc.server [ 1365.889622] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9848c821-bdb4-4cef-a303-a8d209fb0e40 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1365.982625] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "58bbe972-5fc1-4627-90e4-91251e047e86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1366.004693] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75e6676b-4f74-4d42-a7dd-b4aca7237a54 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.012521] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddfb14ac-4ee4-4a27-9854-e2c85c294794 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.042064] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7eb6b528-db92-4330-ac0b-ea918e8ebeff {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.049969] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d25c9fe4-bc10-48c9-b59d-fc8720f75c2e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.063173] env[60788]: DEBUG nova.compute.provider_tree [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1366.072536] env[60788]: DEBUG nova.scheduler.client.report [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1366.088629] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.409s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1366.089193] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597
tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1366.127454] env[60788]: DEBUG nova.compute.utils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1366.129241] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1366.129440] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1366.140213] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1366.210168] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1366.231049] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1366.231294] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1366.231506] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1366.231714] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1366.231858] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1366.232008] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1366.232226] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1366.232384] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1366.232575] env[60788]: DEBUG
nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1366.232759] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1366.232934] env[60788]: DEBUG nova.virt.hardware [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1366.233774] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f87e528-efa2-4bde-8828-04122ebaa271 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.242246] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9473ffb-30a2-43e7-8808-d7d178bfcf01 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1366.311503] env[60788]: DEBUG nova.policy [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '27ee7a4e0eed421aa4713c0db3b7de0a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7bb212f32eb0400888d4daf5ef998d1e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1366.619960] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Successfully created port: def35d13-443b-499d-afc2-c588808b622e {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1366.991326] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Successfully created port: c401be00-7dc6-4e1e-98d3-c0c192c337dc {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1367.741997] env[60788]: DEBUG nova.compute.manager [req-695a4854-1134-4cd7-b254-1798a757f296 req-f3d0e37e-b35f-475e-b003-bd5b65236447 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Received event network-vif-plugged-def35d13-443b-499d-afc2-c588808b622e {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1367.742265] env[60788]: DEBUG oslo_concurrency.lockutils [req-695a4854-1134-4cd7-b254-1798a757f296 
req-f3d0e37e-b35f-475e-b003-bd5b65236447 service nova] Acquiring lock "58bbe972-5fc1-4627-90e4-91251e047e86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1367.742448] env[60788]: DEBUG oslo_concurrency.lockutils [req-695a4854-1134-4cd7-b254-1798a757f296 req-f3d0e37e-b35f-475e-b003-bd5b65236447 service nova] Lock "58bbe972-5fc1-4627-90e4-91251e047e86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.742620] env[60788]: DEBUG oslo_concurrency.lockutils [req-695a4854-1134-4cd7-b254-1798a757f296 req-f3d0e37e-b35f-475e-b003-bd5b65236447 service nova] Lock "58bbe972-5fc1-4627-90e4-91251e047e86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.742785] env[60788]: DEBUG nova.compute.manager [req-695a4854-1134-4cd7-b254-1798a757f296 req-f3d0e37e-b35f-475e-b003-bd5b65236447 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] No waiting events found dispatching network-vif-plugged-def35d13-443b-499d-afc2-c588808b622e {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1367.742946] env[60788]: WARNING nova.compute.manager [req-695a4854-1134-4cd7-b254-1798a757f296 req-f3d0e37e-b35f-475e-b003-bd5b65236447 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Received unexpected event network-vif-plugged-def35d13-443b-499d-afc2-c588808b622e for instance with vm_state building and task_state deleting.
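The event records above are one half of Nova's external-event handshake: Neutron reports network-vif-plugged through the Nova API, and the compute node pops a waiter registered under "<event-name>-<port-id>"; here the instance is already in task_state deleting, nobody is waiting, and the event is logged as unexpected. A toy version of the pop/dispatch pattern, with threading.Event standing in for Nova's eventlet-based plumbing (names and keys mirror the log but the code is illustrative):

import threading

class InstanceEvents:
    """Registry of per-instance waiters, keyed by strings such as
    'network-vif-plugged-def35d13-443b-499d-afc2-c588808b622e'."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the role of the "-events" lock
        self._waiters = {}              # (instance_uuid, event_key) -> Event

    def prepare_for_event(self, instance_uuid, event_key):
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_key)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_key):
        with self._lock:
            return self._waiters.pop((instance_uuid, event_key), None)

def external_instance_event(events, instance_uuid, event_key):
    waiter = events.pop_instance_event(instance_uuid, event_key)
    if waiter is None:
        # -> "No waiting events found dispatching ..." plus the WARNING above
        print('Received unexpected event %s' % event_key)
    else:
        waiter.set()  # wakes the build thread blocked on this VIF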
[ 1368.147543] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Successfully updated port: def35d13-443b-499d-afc2-c588808b622e {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1368.160163] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.160327] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquired lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.160484] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1368.255911] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1368.589036] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Successfully updated port: c401be00-7dc6-4e1e-98d3-c0c192c337dc {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1368.598752] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "refresh_cache-fb532f8b-5323-4f7a-be64-c6076a1862ae" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.598837] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquired lock "refresh_cache-fb532f8b-5323-4f7a-be64-c6076a1862ae" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.598987] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1368.664605] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Updating instance_info_cache with network_info: [{"id": "def35d13-443b-499d-afc2-c588808b622e", "address": "fa:16:3e:11:d1:d6", "network": {"id": "6e760255-04eb-4a73-b6a1-2d1e18fbc7fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-701168580-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "806b14f20c24458c93ed146a412a72db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0d2101e-2d93-4310-a242-af2d9ecdaf9b", "external-id": "nsx-vlan-transportzone-121", "segmentation_id": 121, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdef35d13-44", "ovs_interfaceid": "def35d13-443b-499d-afc2-c588808b622e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1368.677713] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Releasing lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
1368.678076] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance network_info: |[{"id": "def35d13-443b-499d-afc2-c588808b622e", "address": "fa:16:3e:11:d1:d6", "network": {"id": "6e760255-04eb-4a73-b6a1-2d1e18fbc7fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-701168580-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "806b14f20c24458c93ed146a412a72db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0d2101e-2d93-4310-a242-af2d9ecdaf9b", "external-id": "nsx-vlan-transportzone-121", "segmentation_id": 121, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdef35d13-44", "ovs_interfaceid": "def35d13-443b-499d-afc2-c588808b622e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1368.678681] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:11:d1:d6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a0d2101e-2d93-4310-a242-af2d9ecdaf9b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'def35d13-443b-499d-afc2-c588808b622e', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1368.686554] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Creating folder: Project (806b14f20c24458c93ed146a412a72db). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1368.687036] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-91b7c449-9ff2-4ad8-9968-7752a48109ad {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.696165] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1368.701833] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Created folder: Project (806b14f20c24458c93ed146a412a72db) in parent group-v449747. 
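The Folder.CreateFolder round-trip above (and the Instances folder that follows) is how the driver builds the per-tenant Project/Instances hierarchy before creating the VM. A sketch of issuing that call through an established oslo_vmware.api.VMwareAPISession, using its generic invoke_api interface; the DuplicateName handling is an assumption about how a caller would make the operation idempotent when the folder already exists:

from oslo_vmware import exceptions as vexc

def create_folder(session, parent_folder_ref, name):
    """Create a child folder under parent_folder_ref (e.g. group-v449824)."""
    try:
        # invoke_api issues the SOAP call and transparently re-logs-in
        # if the vCenter session has expired.
        return session.invoke_api(session.vim, 'CreateFolder',
                                  parent_folder_ref, name=name)
    except vexc.DuplicateName:
        # Assumed handling: another build created it first; a real caller
        # would look the existing folder up and return that ref instead.
        return None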
[ 1368.702048] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Creating folder: Instances. Parent ref: group-v449824. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1368.702312] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a75242ba-2226-4892-91c2-e065403bf877 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.711611] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Created folder: Instances in parent group-v449824. [ 1368.711843] env[60788]: DEBUG oslo.service.loopingcall [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1368.712046] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1368.712259] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e077ce54-239a-4206-b7ad-b3183c2da6c4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.732502] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1368.732502] env[60788]: value = "task-2205236" [ 1368.732502] env[60788]: _type = "Task" [ 1368.732502] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.739915] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205236, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1368.922774] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Updating instance_info_cache with network_info: [{"id": "c401be00-7dc6-4e1e-98d3-c0c192c337dc", "address": "fa:16:3e:2c:a7:11", "network": {"id": "538dcb21-4869-42a4-bd64-81033b953683", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-341799346-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7bb212f32eb0400888d4daf5ef998d1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "304be4f7-4e36-4468-9ef4-e457341cef18", "external-id": "nsx-vlan-transportzone-911", "segmentation_id": 911, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc401be00-7d", "ovs_interfaceid": "c401be00-7dc6-4e1e-98d3-c0c192c337dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1368.939696] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Releasing lock "refresh_cache-fb532f8b-5323-4f7a-be64-c6076a1862ae" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1368.940031] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance network_info: |[{"id": "c401be00-7dc6-4e1e-98d3-c0c192c337dc", "address": "fa:16:3e:2c:a7:11", "network": {"id": "538dcb21-4869-42a4-bd64-81033b953683", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-341799346-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7bb212f32eb0400888d4daf5ef998d1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "304be4f7-4e36-4468-9ef4-e457341cef18", "external-id": "nsx-vlan-transportzone-911", "segmentation_id": 911, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc401be00-7d", "ovs_interfaceid": "c401be00-7dc6-4e1e-98d3-c0c192c337dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1368.940723] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:a7:11', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '304be4f7-4e36-4468-9ef4-e457341cef18', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c401be00-7dc6-4e1e-98d3-c0c192c337dc', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1368.949583] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Creating folder: Project (7bb212f32eb0400888d4daf5ef998d1e). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1368.950149] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fc69a36e-a5df-4947-b8e8-e1213c60a0f6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.961238] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Created folder: Project (7bb212f32eb0400888d4daf5ef998d1e) in parent group-v449747. [ 1368.961423] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Creating folder: Instances. Parent ref: group-v449827. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1368.961648] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce1fcbd5-9204-42cc-bb60-fec40b26ddf0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.969766] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Created folder: Instances in parent group-v449827. [ 1368.969997] env[60788]: DEBUG oslo.service.loopingcall [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
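The "Instance VIF info" entry above is a direct projection of the neutron network_info logged just before it: the port UUID, MAC address, bridge name, and NSX logical-switch id are repackaged into the dict the VM-build step consumes. A sketch of that mapping under the field names visible in the log (illustrative; not nova's actual vif translation code):

def network_info_to_vif_info(port):
    """Map one neutron network_info entry to the VIF info dict above."""
    details = port["details"]
    return {
        "network_name": port["network"]["bridge"],   # 'br-int'
        "mac_address": port["address"],               # 'fa:16:3e:2c:a7:11'
        "network_ref": {
            "type": "OpaqueNetwork",
            "network-id": details["nsx-logical-switch-id"],
            "network-type": "nsx.LogicalSwitch",
            "use-external-id": True,
        },
        "iface_id": port["id"],                       # neutron port UUID
        "vif_model": "vmxnet3",
    }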
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1368.970256] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1368.970459] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f05cbee8-7cf4-4852-aa58-ac628dd12dc5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.989109] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1368.989109] env[60788]: value = "task-2205239" [ 1368.989109] env[60788]: _type = "Task" [ 1368.989109] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.999468] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205239, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.242719] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205236, 'name': CreateVM_Task, 'duration_secs': 0.311879} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1369.242945] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1369.243575] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.243745] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.244097] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1369.244357] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-79efe37d-3cf6-4c32-bd84-0348d4cc976a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.248775] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Waiting for the task: (returnval){ [ 1369.248775] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52728f88-c621-c8da-246f-ba8df01c5b45" [ 1369.248775] env[60788]: _type = "Task" [ 1369.248775] env[60788]: } to complete. 
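The Acquiring/Acquired/Releasing triplets around the image-cache path come from oslo.concurrency's lockutils, which the driver uses to serialize work on one cached image per datastore (the log also shows an additional external semaphore being taken; that detail is omitted here). A minimal sketch of the same pattern, with the lock name mirroring the log and a placeholder body:

from oslo_concurrency import lockutils

image_id = "1d9d6f6c-1335-48c8-9690-b6c8e781cb21"
lock_name = f"[datastore2] devstack-image-cache_base/{image_id}"

with lockutils.lock(lock_name):
    # search the datastore / fetch the image here; only one greenthread
    # per process works on this cache entry at a time
    pass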
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.258071] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52728f88-c621-c8da-246f-ba8df01c5b45, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.498789] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205239, 'name': CreateVM_Task, 'duration_secs': 0.376957} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1369.499041] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1369.499885] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.758914] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1369.759272] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1369.759509] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.759730] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.760043] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1369.760328] env[60788]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-21873411-7f80-4ffd-9429-d1817e66ed16 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.765060] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Waiting for the task: (returnval){ [ 1369.765060] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]523b63a8-06f1-4e70-cedd-35f43a7eb64e" [ 1369.765060] env[60788]: _type = "Task" [ 1369.765060] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.772560] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]523b63a8-06f1-4e70-cedd-35f43a7eb64e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.776462] env[60788]: DEBUG nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Received event network-changed-def35d13-443b-499d-afc2-c588808b622e {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1369.776652] env[60788]: DEBUG nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Refreshing instance network info cache due to event network-changed-def35d13-443b-499d-afc2-c588808b622e. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1369.776855] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Acquiring lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.777007] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Acquired lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.777208] env[60788]: DEBUG nova.network.neutron [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Refreshing network info cache for port def35d13-443b-499d-afc2-c588808b622e {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1370.131767] env[60788]: DEBUG nova.network.neutron [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Updated VIF entry in instance network info cache for port def35d13-443b-499d-afc2-c588808b622e. 
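"Updated VIF entry" above is the tail end of the event flow: a network-changed event names one port, the handler re-fetches that port's data from neutron, and the matching entry in the instance's cached network_info list is swapped in place. A compact sketch of that single-entry update (the real cache lives in nova's InstanceInfoCache object; this is just the visible shape of the operation):

def update_vif_entry(network_info, port_id, fresh_vif):
    """Replace (or add) the cached VIF dict for one neutron port."""
    for i, vif in enumerate(network_info):
        if vif["id"] == port_id:
            network_info[i] = fresh_vif   # replace the stale entry in place
            return True
    network_info.append(fresh_vif)        # port not cached yet; add it
    return False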
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1370.132161] env[60788]: DEBUG nova.network.neutron [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Updating instance_info_cache with network_info: [{"id": "def35d13-443b-499d-afc2-c588808b622e", "address": "fa:16:3e:11:d1:d6", "network": {"id": "6e760255-04eb-4a73-b6a1-2d1e18fbc7fe", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-701168580-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "806b14f20c24458c93ed146a412a72db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0d2101e-2d93-4310-a242-af2d9ecdaf9b", "external-id": "nsx-vlan-transportzone-121", "segmentation_id": 121, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdef35d13-44", "ovs_interfaceid": "def35d13-443b-499d-afc2-c588808b622e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1370.141190] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Releasing lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1370.141427] env[60788]: DEBUG nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Received event network-vif-plugged-c401be00-7dc6-4e1e-98d3-c0c192c337dc {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1370.141619] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Acquiring lock "fb532f8b-5323-4f7a-be64-c6076a1862ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1370.141815] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1370.141977] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1370.142154] env[60788]: DEBUG 
nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] No waiting events found dispatching network-vif-plugged-c401be00-7dc6-4e1e-98d3-c0c192c337dc {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1370.142321] env[60788]: WARNING nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Received unexpected event network-vif-plugged-c401be00-7dc6-4e1e-98d3-c0c192c337dc for instance with vm_state building and task_state spawning. [ 1370.142488] env[60788]: DEBUG nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Received event network-changed-c401be00-7dc6-4e1e-98d3-c0c192c337dc {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1370.142637] env[60788]: DEBUG nova.compute.manager [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Refreshing instance network info cache due to event network-changed-c401be00-7dc6-4e1e-98d3-c0c192c337dc. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1370.142817] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Acquiring lock "refresh_cache-fb532f8b-5323-4f7a-be64-c6076a1862ae" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.142956] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Acquired lock "refresh_cache-fb532f8b-5323-4f7a-be64-c6076a1862ae" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1370.143132] env[60788]: DEBUG nova.network.neutron [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Refreshing network info cache for port c401be00-7dc6-4e1e-98d3-c0c192c337dc {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1370.274548] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1370.274823] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1370.275044] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.393223] env[60788]: DEBUG nova.network.neutron [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Updated VIF entry in instance network info cache for port c401be00-7dc6-4e1e-98d3-c0c192c337dc. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1370.393604] env[60788]: DEBUG nova.network.neutron [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Updating instance_info_cache with network_info: [{"id": "c401be00-7dc6-4e1e-98d3-c0c192c337dc", "address": "fa:16:3e:2c:a7:11", "network": {"id": "538dcb21-4869-42a4-bd64-81033b953683", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-341799346-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7bb212f32eb0400888d4daf5ef998d1e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "304be4f7-4e36-4468-9ef4-e457341cef18", "external-id": "nsx-vlan-transportzone-911", "segmentation_id": 911, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc401be00-7d", "ovs_interfaceid": "c401be00-7dc6-4e1e-98d3-c0c192c337dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1370.403808] env[60788]: DEBUG oslo_concurrency.lockutils [req-684c4ec2-390f-47a6-bd07-85459069d56d req-735e3ff0-5e1f-4319-9165-e22d4b2db935 service nova] Releasing lock "refresh_cache-fb532f8b-5323-4f7a-be64-c6076a1862ae" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1371.013106] env[60788]: DEBUG oslo_concurrency.lockutils [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1377.754520] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1379.761821] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1380.209881] env[60788]: DEBUG 
oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1380.210122] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1381.754133] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1381.754133] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1381.754133] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1382.753829] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1382.754106] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1382.754257] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances with incomplete migration {{(pid=60788) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 1383.760425] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1383.760791] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1383.772509] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1383.772730] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.772898] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1383.773070] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1383.774186] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c72fbe52-5088-4425-9689-a5ab95dd2c0f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.783347] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f46d926c-17d1-41ae-bf68-11e5d49144d6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.797776] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ece13432-a9bd-494e-bed6-a2e357697733 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.804326] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45bcb662-3b29-4892-b657-66086f8aac48 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.837186] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181223MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1383.837424] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1383.837776] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.943078] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c206be99-2f74-4c28-a008-e6edcccf65bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943260] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943399] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943522] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943645] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943766] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943883] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.943998] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.944132] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.944245] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1383.957306] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 778f4021-05ef-4904-864e-769e035df239 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.969420] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 1793989e-b036-47d0-a036-5960936e145a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.981174] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1383.992459] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1384.003058] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
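The usage figures reported in the next entries follow directly from what was just enumerated: ten actively managed instances, each holding the allocation {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, plus the 512 MB the inventory reserves. A worked check:

# Ten tracked instances, identical allocations, 512 MB reserved RAM.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10

used_ram = 512 + sum(a["MEMORY_MB"] for a in allocations)   # 512 + 1280 = 1792 MB
used_disk = sum(a["DISK_GB"] for a in allocations)          # 10 GB
used_vcpus = sum(a["VCPU"] for a in allocations)            # 10

assert (used_ram, used_disk, used_vcpus) == (1792, 10, 10)
# matches the final resource view below: used_ram=1792MB used_disk=10GB used_vcpus=10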
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1384.003311] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1384.003463] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1384.019890] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing inventories for resource provider 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1384.036673] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating ProviderTree inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1384.036902] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1384.048722] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing aggregate associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, aggregates: None {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1384.067148] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing trait associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1384.245960] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7ac1680-0e29-47b6-a2da-7336a924eb16 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.254136] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3aa070dd-357d-4320-9ae8-944e2152ec15 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.285104] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ad48cae-6588-432a-8e7e-0d1bee6eab03 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.292556] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fb27f1a-250f-433d-9739-7b1f72efebe5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.306873] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1384.315142] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1384.328824] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1384.328824] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.491s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1385.321669] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1385.321937] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1385.321990] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1385.347681] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.347869] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348015] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348150] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348273] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348395] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348513] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348631] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348749] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.348885] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1385.349077] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
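Every candidate above was skipped for the same reason, so the heal pass ends with nothing to do. The selection reduces to a filter on the instance's state; a minimal sketch of that loop (field names are simplified stand-ins for nova's instance objects):

def instances_to_heal(instances):
    """Keep only instances whose network info cache can be refreshed."""
    to_heal = []
    for inst in instances:
        if inst["vm_state"] == "building":
            # mirrors: "Skipping network cache update ... it is Building."
            continue
        to_heal.append(inst)
    return to_heal   # empty here -> "Didn't find any instances ..."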
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1385.349643] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1387.753887] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1387.754284] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1387.754337] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 1387.763470] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] There are 0 instances to clean {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 1400.721207] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1400.721580] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1413.986843] env[60788]: WARNING oslo_vmware.rw_handles [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in 
_read_status [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1413.986843] env[60788]: ERROR oslo_vmware.rw_handles [ 1413.987709] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1413.989333] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1413.989617] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Copying Virtual Disk [datastore2] vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/f21b71df-ca27-424b-8722-62fa26772ed9/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1413.990010] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5d9754df-002a-4bd5-ba55-1cd040628b55 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1413.999089] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Waiting for the task: (returnval){ [ 1413.999089] env[60788]: value = "task-2205240" [ 1413.999089] env[60788]: _type = "Task" [ 1413.999089] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1414.007103] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Task: {'id': task-2205240, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1414.510134] env[60788]: DEBUG oslo_vmware.exceptions [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Fault InvalidArgument not matched. 
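"Fault InvalidArgument not matched" records that the fault name returned by the CopyVirtualDisk_Task had no specific exception class registered, so, as the entries that follow show, it surfaces as a generic VimFaultException carrying the fault list. A sketch of that lookup-with-fallback shape, using a hypothetical registry rather than oslo.vmware's real fault table:

class VimFaultException(Exception):
    """Generic fallback carrying the raw fault names (sketch, not oslo.vmware's class)."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

_FAULT_CLASSES = {}  # hypothetical: {'FileNotFound': FileNotFoundFault, ...}

def translate_fault(fault_name, message):
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is not None:
        return cls(message)
    # no match -> generic exception, as with InvalidArgument above
    return VimFaultException([fault_name], message)

err = translate_fault("InvalidArgument",
                      "A specified parameter was not correct: fileType")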
[ 1414.510412] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1414.510983] env[60788]: ERROR nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1414.510983] env[60788]: Faults: ['InvalidArgument']
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Traceback (most recent call last):
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] yield resources
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self.driver.spawn(context, instance, image_meta,
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self._fetch_image_if_missing(context, vi)
[ 1414.510983] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] image_cache(vi, tmp_image_ds_loc)
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] vm_util.copy_virtual_disk(
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] session._wait_for_task(vmdk_copy_task)
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] return self.wait_for_task(task_ref)
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] return evt.wait()
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] result = hub.switch()
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1414.511472] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] return self.greenlet.switch()
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self.f(*self.args, **self.kw)
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] raise exceptions.translate_fault(task_info.error)
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Faults: ['InvalidArgument']
[ 1414.511867] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf]
[ 1414.511867] env[60788]: INFO nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Terminating instance
[ 1414.512846] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1414.513057] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1414.513289] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d0aeca5-1d3f-43db-9a77-058e94296345 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1414.515325] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1414.515527] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1414.516231] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ce9414-3344-43ce-bb1a-cdbc34d5cc96 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1414.522790] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1414.522986] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1cd93d4c-22ae-48db-a0b8-32d77b3c46cb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1414.525036] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1414.525214] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1414.526109] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91640078-61ec-46ac-b67b-16a4ddc4b033 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1414.530769] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for the task: (returnval){
[ 1414.530769] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]521bf4a7-dc86-c713-1599-f93e1b506a98"
[ 1414.530769] env[60788]: _type = "Task"
[ 1414.530769] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1414.537820] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]521bf4a7-dc86-c713-1599-f93e1b506a98, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1414.587158] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1414.587375] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1414.587553] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Deleting the datastore file [datastore2] c206be99-2f74-4c28-a008-e6edcccf65bf {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1414.587802] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cbb13bd9-25c8-427f-a9bc-a298ba587a51 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1414.593812] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Waiting for the task: (returnval){
[ 1414.593812] env[60788]: value = "task-2205242"
[ 1414.593812] env[60788]: _type = "Task"
[ 1414.593812] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1414.600967] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Task: {'id': task-2205242, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
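Annotation: the teardown interleaved above follows a fixed order: UnregisterVM takes the instance out of the vCenter inventory first, then FileManager.DeleteDatastoreFile_Task removes its datastore directory, and that delete is itself a long-running task that gets polled before the instance is declared destroyed. A compact sketch of that sequence; every name below (vm, file_manager, .state, .refresh) is an illustrative stand-in for the vSphere calls logged here, not a real client API:

def destroy_on_hypervisor(vm, file_manager, datastore_dir):
    # 1. UnregisterVM: take the VM out of the vCenter inventory.
    vm.unregister()
    # 2. DeleteDatastoreFile_Task: remove the instance directory; the delete
    #    is a long-running task, so poll it before declaring success.
    task = file_manager.delete_file(datastore_dir)
    while task.state not in ('success', 'error'):
        task.refresh()
    if task.state == 'error':
        raise RuntimeError(task.error)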
[ 1415.040966] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1415.041265] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Creating directory with path [datastore2] vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1415.041495] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bfb3226c-7f96-49a7-9271-33bb81a9d5dc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.053586] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Created directory with path [datastore2] vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1415.053771] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Fetch image to [datastore2] vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1415.053938] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1415.054653] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae5cd14-af99-43a1-b64a-02ac30120ee5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.060916] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9e9347e-e157-4da9-932c-6acba71c5917 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.069659] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b140c870-7ae5-401a-b7b4-16dd27f14193 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.101590] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9123aac6-cf6b-4f4a-ad12-18ed395ec374 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.109630] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2878f553-6d43-49a6-903c-7d324e6b64b0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.111265] env[60788]: DEBUG oslo_vmware.api [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Task: {'id': task-2205242, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079411} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1415.111485] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1415.111659] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1415.111822] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1415.111995] env[60788]: INFO nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1415.114319] env[60788]: DEBUG nova.compute.claims [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1415.114501] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1415.114718] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1415.131915] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1415.180151] env[60788]: DEBUG oslo_vmware.rw_handles [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1415.239490] env[60788]: DEBUG oslo_vmware.rw_handles [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1415.239677] env[60788]: DEBUG oslo_vmware.rw_handles [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
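Annotation: the rw_handles records above show the write path for image staging: a generic service ticket is acquired from the SessionManager, then the image bytes are streamed over an HTTP connection to the datastore's "/folder/..." URL. A minimal sketch of that idea with stdlib urllib; the PUT verb, the Content-Length handling, and especially the ticket cookie name are assumptions for illustration, not the oslo.vmware implementation:

import urllib.request

def upload_vmdk(url, ticket, stream, size):
    # Streams image bytes to the ESX "/folder/..." URL with a plain HTTP PUT.
    # The ticket cookie form and auth mechanics are assumed; the real handle
    # also negotiates TLS options and connection reuse.
    req = urllib.request.Request(url, data=stream, method='PUT')
    req.add_header('Content-Length', str(size))
    req.add_header('Cookie', 'vmware_cgi_ticket=%s' % ticket)  # assumed header form
    with urllib.request.urlopen(req) as resp:
        return resp.status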
[ 1415.375223] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58bd5e98-e589-4fd4-b211-732c82f2f07b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.382926] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e96d16-34e8-4b65-8434-c35fa37dbf1b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.412911] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b556d76b-81d5-43b2-9340-47faaaa7edab {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.419839] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c25c4e-751e-49e8-bb6f-597f0df92f28 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.432931] env[60788]: DEBUG nova.compute.provider_tree [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1415.441639] env[60788]: DEBUG nova.scheduler.client.report [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1415.454751] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.340s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1415.455264] env[60788]: ERROR nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1415.455264] env[60788]: Faults: ['InvalidArgument']
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Traceback (most recent call last):
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self.driver.spawn(context, instance, image_meta,
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self._fetch_image_if_missing(context, vi)
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] image_cache(vi, tmp_image_ds_loc)
[ 1415.455264] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] vm_util.copy_virtual_disk(
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] session._wait_for_task(vmdk_copy_task)
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] return self.wait_for_task(task_ref)
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] return evt.wait()
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] result = hub.switch()
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] return self.greenlet.switch()
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1415.455625] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] self.f(*self.args, **self.kw)
[ 1415.456097] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1415.456097] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] raise exceptions.translate_fault(task_info.error)
[ 1415.456097] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1415.456097] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Faults: ['InvalidArgument']
[ 1415.456097] env[60788]: ERROR nova.compute.manager [instance: c206be99-2f74-4c28-a008-e6edcccf65bf]
[ 1415.456097] env[60788]: DEBUG nova.compute.utils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1415.457285] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Build of instance c206be99-2f74-4c28-a008-e6edcccf65bf was re-scheduled: A specified parameter was not correct: fileType
[ 1415.457285] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1415.457662] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1415.457855] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
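Annotation: both tracebacks end in exceptions.translate_fault(task_info.error). The fault name carried in the task's error state is looked up against registered exception classes, and when nothing matches (the "Fault InvalidArgument not matched." record earlier) a generic VimFaultException carrying the fault list is raised instead. A sketch of that dispatch under those assumptions, with a deliberately empty registry so the fallback path fires:

class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

_FAULT_CLASSES = {}  # a real registry would map fault names to exception types

def translate_fault(fault_name, message):
    # Try a registered class for the fault name first; on a miss, fall back
    # to the generic exception that carries the fault list.
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is not None:
        return cls(message)
    return VimFaultException([fault_name], message)

exc = translate_fault('InvalidArgument',
                      'A specified parameter was not correct: fileType')
assert exc.fault_list == ['InvalidArgument']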
[ 1415.458050] env[60788]: DEBUG nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1415.458223] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1415.926551] env[60788]: DEBUG nova.network.neutron [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1415.948652] env[60788]: INFO nova.compute.manager [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Took 0.49 seconds to deallocate network for instance.
[ 1416.051208] env[60788]: INFO nova.scheduler.client.report [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Deleted allocations for instance c206be99-2f74-4c28-a008-e6edcccf65bf
[ 1416.074091] env[60788]: DEBUG oslo_concurrency.lockutils [None req-52d66a15-824d-4b24-ab65-25c5e55d20ff tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 640.288s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.075256] env[60788]: DEBUG oslo_concurrency.lockutils [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 443.499s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1416.075489] env[60788]: DEBUG oslo_concurrency.lockutils [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Acquiring lock "c206be99-2f74-4c28-a008-e6edcccf65bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1416.075706] env[60788]: DEBUG oslo_concurrency.lockutils [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1416.075877] env[60788]: DEBUG oslo_concurrency.lockutils [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.078079] env[60788]: INFO nova.compute.manager [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Terminating instance
[ 1416.079760] env[60788]: DEBUG nova.compute.manager [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1416.080009] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1416.080632] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-aac6b49e-790b-44eb-9272-faf5b60f150c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.089943] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-937dac83-adde-4fb2-be3f-d6c646583f41 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.101297] env[60788]: DEBUG nova.compute.manager [None req-b5ce8980-0966-470a-b2f9-b538df6c637e tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: bbfb23ab-0f4d-4195-ad4f-12b405a28267] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1416.121946] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c206be99-2f74-4c28-a008-e6edcccf65bf could not be found.
[ 1416.122794] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1416.122794] env[60788]: INFO nova.compute.manager [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1416.122794] env[60788]: DEBUG oslo.service.loopingcall [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1416.122941] env[60788]: DEBUG nova.compute.manager [-] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1416.123139] env[60788]: DEBUG nova.network.neutron [-] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1416.126172] env[60788]: DEBUG nova.compute.manager [None req-b5ce8980-0966-470a-b2f9-b538df6c637e tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: bbfb23ab-0f4d-4195-ad4f-12b405a28267] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 1416.146146] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5ce8980-0966-470a-b2f9-b538df6c637e tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "bbfb23ab-0f4d-4195-ad4f-12b405a28267" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.718s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.159461] env[60788]: DEBUG nova.compute.manager [None req-963badbc-d090-4cb9-9b20-c632766e436c tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] [instance: 778f4021-05ef-4904-864e-769e035df239] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1416.164626] env[60788]: DEBUG nova.network.neutron [-] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1416.173744] env[60788]: INFO nova.compute.manager [-] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] Took 0.05 seconds to deallocate network for instance.
[ 1416.189648] env[60788]: DEBUG nova.compute.manager [None req-963badbc-d090-4cb9-9b20-c632766e436c tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] [instance: 778f4021-05ef-4904-864e-769e035df239] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 1416.213128] env[60788]: DEBUG oslo_concurrency.lockutils [None req-963badbc-d090-4cb9-9b20-c632766e436c tempest-ServersTestMultiNic-852289293 tempest-ServersTestMultiNic-852289293-project-member] Lock "778f4021-05ef-4904-864e-769e035df239" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.551s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.223900] env[60788]: DEBUG nova.compute.manager [None req-24c261cc-9f9c-4d34-8dc4-8b24ab7bc366 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 1793989e-b036-47d0-a036-5960936e145a] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1416.246704] env[60788]: DEBUG nova.compute.manager [None req-24c261cc-9f9c-4d34-8dc4-8b24ab7bc366 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 1793989e-b036-47d0-a036-5960936e145a] Instance disappeared before build. {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2414}}
[ 1416.269764] env[60788]: DEBUG oslo_concurrency.lockutils [None req-24c261cc-9f9c-4d34-8dc4-8b24ab7bc366 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "1793989e-b036-47d0-a036-5960936e145a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.528s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.271383] env[60788]: DEBUG oslo_concurrency.lockutils [None req-649de16e-c14e-4348-baf5-16daf9889e39 tempest-ServersV294TestFqdnHostnames-1639279538 tempest-ServersV294TestFqdnHostnames-1639279538-project-member] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.196s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.272166] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 340.520s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1416.272407] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c206be99-2f74-4c28-a008-e6edcccf65bf] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1416.272605] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "c206be99-2f74-4c28-a008-e6edcccf65bf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1416.279074] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1416.327042] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1416.327042] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1416.327785] env[60788]: INFO nova.compute.claims [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1416.517677] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a57d866-2aed-4047-88f0-d41c014e6c4e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.525507] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58c574c4-d8da-4a59-b2b2-1adfa34c5d15 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.556201] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b810f6c2-ed40-41d7-93fa-eec881131018 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.563008] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1358d365-d3b7-4820-ae98-ad22ef477d1b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.575785] env[60788]: DEBUG nova.compute.provider_tree [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1416.585062] env[60788]: DEBUG nova.scheduler.client.report [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1416.598872] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
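Annotation: twice in this section the resource tracker recomputes the provider's inventory and the report client decides nothing needs to be written back to placement ("Inventory has not changed for provider ... based on inventory data: ..."). That decision reduces to comparing the freshly built inventory dict against the cached one; a sketch using the exact figures from the records above:

def inventory_changed(cached, computed):
    # Nested dicts of plain numbers compare by value, so equality is enough
    # to reproduce the "Inventory has not changed" decision logged above.
    return cached != computed

cached = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176,
                'step_size': 1, 'allocation_ratio': 1.0},
}
assert not inventory_changed(cached, {k: dict(v) for k, v in cached.items()})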
[ 1416.599340] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1416.631336] env[60788]: DEBUG nova.compute.utils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1416.632891] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1416.633137] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1416.643541] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1416.692679] env[60788]: DEBUG nova.policy [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34d238f3928b4f40813646c9867375c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a80b1c30e829410c9a324f5a4af8c9f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1416.709652] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1416.738603] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1416.738849] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1416.739291] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1416.739291] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1416.739458] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1416.739522] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1416.739721] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1416.739882] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1416.740121] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1416.740360] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1416.740438] env[60788]: DEBUG nova.virt.hardware [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
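Annotation: the hardware.py records above walk the guest CPU topology search: for 1 vCPU with limits of 65536 sockets/cores/threads, exactly one factorisation exists, hence "Got 1 possible topologies" and "[VirtCPUTopology(cores=1,sockets=1,threads=1)]". A self-contained sketch of the core enumeration under those inputs; it reproduces the counting step only, not Nova's preference ordering:

from dataclasses import dataclass

@dataclass(frozen=True)
class Topology:
    sockets: int
    cores: int
    threads: int

def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    # Enumerate every sockets*cores*threads factorisation of the vCPU count
    # that fits within the limits; iterate divisors so huge limits stay cheap.
    out = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        rest = vcpus // s
        for c in range(1, min(rest, max_cores) + 1):
            if rest % c:
                continue
            t = rest // c
            if t <= max_threads:
                out.append(Topology(s, c, t))
    return out

# The inputs from the records above yield exactly one topology, 1:1:1.
print(possible_topologies(1, 65536, 65536, 65536))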
[ 1416.741312] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0420f57-f561-4c36-a243-21d18d39eed0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.749565] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2c832b3-974d-4816-b9a9-677ffa172106 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1417.041697] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Successfully created port: 5756b302-b1c1-4f30-b892-2dca4cb73e4e {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1418.036301] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Successfully updated port: 5756b302-b1c1-4f30-b892-2dca4cb73e4e {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1418.051711] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "refresh_cache-f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1418.051865] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "refresh_cache-f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1418.052052] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1418.089307] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1418.103218] env[60788]: DEBUG nova.compute.manager [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Received event network-vif-plugged-5756b302-b1c1-4f30-b892-2dca4cb73e4e {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1418.103439] env[60788]: DEBUG oslo_concurrency.lockutils [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] Acquiring lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1418.103651] env[60788]: DEBUG oslo_concurrency.lockutils [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1418.103817] env[60788]: DEBUG oslo_concurrency.lockutils [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1418.103981] env[60788]: DEBUG nova.compute.manager [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] No waiting events found dispatching network-vif-plugged-5756b302-b1c1-4f30-b892-2dca4cb73e4e {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1418.104165] env[60788]: WARNING nova.compute.manager [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Received unexpected event network-vif-plugged-5756b302-b1c1-4f30-b892-2dca4cb73e4e for instance with vm_state building and task_state spawning.
[ 1418.104325] env[60788]: DEBUG nova.compute.manager [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Received event network-changed-5756b302-b1c1-4f30-b892-2dca4cb73e4e {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1418.104477] env[60788]: DEBUG nova.compute.manager [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Refreshing instance network info cache due to event network-changed-5756b302-b1c1-4f30-b892-2dca4cb73e4e. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1418.104637] env[60788]: DEBUG oslo_concurrency.lockutils [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] Acquiring lock "refresh_cache-f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1418.364329] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Updating instance_info_cache with network_info: [{"id": "5756b302-b1c1-4f30-b892-2dca4cb73e4e", "address": "fa:16:3e:a0:ef:fb", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5756b302-b1", "ovs_interfaceid": "5756b302-b1c1-4f30-b892-2dca4cb73e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1418.376741] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "refresh_cache-f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1418.376741] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance network_info: |[{"id": "5756b302-b1c1-4f30-b892-2dca4cb73e4e", "address": "fa:16:3e:a0:ef:fb", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5756b302-b1", "ovs_interfaceid": "5756b302-b1c1-4f30-b892-2dca4cb73e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 1418.376924] env[60788]: DEBUG oslo_concurrency.lockutils [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] Acquired lock "refresh_cache-f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1418.376924] env[60788]: DEBUG nova.network.neutron [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Refreshing network info cache for port 5756b302-b1c1-4f30-b892-2dca4cb73e4e {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1418.377696] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:ef:fb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a4b6ddb2-2e19-4031-9b22-add90d41a114', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5756b302-b1c1-4f30-b892-2dca4cb73e4e', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1418.386135] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating folder: Project (a80b1c30e829410c9a324f5a4af8c9f7). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1418.389201] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-298161f1-4990-45e2-890c-fd01eadc75bd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1418.400355] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Created folder: Project (a80b1c30e829410c9a324f5a4af8c9f7) in parent group-v449747.
[ 1418.400355] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating folder: Instances. Parent ref: group-v449830. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1418.400355] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b8f8e58e-a8ff-4cca-96dd-cb4bd4a8485a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1418.407985] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Created folder: Instances in parent group-v449830.
[ 1418.408245] env[60788]: DEBUG oslo.service.loopingcall [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1418.408426] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1418.408647] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-894e7cf8-acbf-4007-b0c6-f48a7586717b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.431190] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1418.431190] env[60788]: value = "task-2205245" [ 1418.431190] env[60788]: _type = "Task" [ 1418.431190] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1418.438711] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205245, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1418.708766] env[60788]: DEBUG nova.network.neutron [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Updated VIF entry in instance network info cache for port 5756b302-b1c1-4f30-b892-2dca4cb73e4e. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1418.709146] env[60788]: DEBUG nova.network.neutron [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Updating instance_info_cache with network_info: [{"id": "5756b302-b1c1-4f30-b892-2dca4cb73e4e", "address": "fa:16:3e:a0:ef:fb", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5756b302-b1", "ovs_interfaceid": "5756b302-b1c1-4f30-b892-2dca4cb73e4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1418.718433] env[60788]: DEBUG oslo_concurrency.lockutils [req-e4549df9-b5f5-45b8-b4e3-d88871266aca req-9770da02-2972-4d7b-a270-e72eca94bff5 service nova] Releasing lock "refresh_cache-f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1418.941283] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205245, 'name': CreateVM_Task, 'duration_secs': 0.275843} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1418.941406] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1418.942063] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1418.942235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1418.942583] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1418.942838] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90a6bb26-be80-4291-9cd2-c797a004f781 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.946939] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 1418.946939] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5210edb5-8e2a-296d-8fe7-a646439fd064" [ 1418.946939] env[60788]: _type = "Task" [ 1418.946939] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1418.953953] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5210edb5-8e2a-296d-8fe7-a646439fd064, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1419.457874] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1419.458215] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1419.458418] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.758634] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1440.753624] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1441.753437] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1442.753673] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1443.748601] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1443.753294] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1443.753482] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1445.755109] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1445.755109] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1445.755109] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1445.774368] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.774507] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.774638] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.774767] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.774893] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.775026] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.775154] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.775275] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.775394] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.775513] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1445.775630] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1445.776084] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1445.776271] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1445.788311] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1445.788521] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1445.788686] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1445.788836] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1445.789898] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ab9425b-9598-4352-9c39-79022e3f811e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.798757] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-120bd3a6-baf8-4306-811c-37d1128db138 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.812545] env[60788]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72cb0ab5-91cb-4bab-85fa-b05411199145 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.818810] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf7e5f2e-ecfe-4958-9b2e-dfa3888b96cd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1445.848660] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181264MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1445.848813] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1445.848977] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1445.919559] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.919724] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.919880] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920018] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920146] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920267] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920388] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920532] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920656] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.920771] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1445.932161] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1445.943279] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1445.955091] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1445.955380] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1445.955555] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1446.105460] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4005c80f-8095-48fe-b523-fe0853bf181b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.114966] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17337394-1d7e-4d90-83f8-3241cc82d5b6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.143133] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82e8232f-c15e-499e-952a-9c5be780aa9c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.150029] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18780296-d208-4a70-a56d-912c59c5358e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1446.164498] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1446.172760] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1446.186375] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1446.186444] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1448.164419] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1464.004259] env[60788]: WARNING oslo_vmware.rw_handles [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1464.004259] env[60788]: ERROR oslo_vmware.rw_handles [ 1464.004880] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1464.007607] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1464.007971] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Copying Virtual Disk [datastore2] vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/ef5e6b45-1e5f-439b-ae91-4b9a9b76142a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1464.008385] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ce4702c7-fb81-4e4d-bac8-e1f6aa7b2ff0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.016642] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 
tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for the task: (returnval){ [ 1464.016642] env[60788]: value = "task-2205246" [ 1464.016642] env[60788]: _type = "Task" [ 1464.016642] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1464.024535] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Task: {'id': task-2205246, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1464.527264] env[60788]: DEBUG oslo_vmware.exceptions [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1464.527534] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1464.528169] env[60788]: ERROR nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1464.528169] env[60788]: Faults: ['InvalidArgument'] [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Traceback (most recent call last): [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] yield resources [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self.driver.spawn(context, instance, image_meta, [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._fetch_image_if_missing(context, vi) [ 1464.528169] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] image_cache(vi, tmp_image_ds_loc) [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] vm_util.copy_virtual_disk( [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] session._wait_for_task(vmdk_copy_task) [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.wait_for_task(task_ref) [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return evt.wait() [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] result = hub.switch() [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1464.528553] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.greenlet.switch() [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self.f(*self.args, **self.kw) [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] raise exceptions.translate_fault(task_info.error) [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Faults: ['InvalidArgument'] [ 1464.528995] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] [ 1464.528995] env[60788]: INFO nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Terminating instance [ 1464.530056] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1464.530268] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1464.530501] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ec916b16-cbca-4cc0-a0cf-fb91740c577c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.532512] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1464.532669] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquired lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1464.532832] env[60788]: DEBUG nova.network.neutron [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1464.538957] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1464.539141] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1464.539799] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-03f0ab3d-b6d7-4621-b203-6bf3b89ce18f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.546800] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Waiting for the task: (returnval){ [ 1464.546800] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52b623f9-2e58-e701-c715-6d270e5938de" [ 1464.546800] env[60788]: _type = "Task" [ 1464.546800] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1464.558631] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52b623f9-2e58-e701-c715-6d270e5938de, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1464.611333] env[60788]: DEBUG nova.network.neutron [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1464.757872] env[60788]: DEBUG nova.network.neutron [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1464.768187] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Releasing lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1464.768187] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1464.768187] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1464.768843] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b624e14-1c80-4a5e-90d2-b38efd94bcb2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.776826] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1464.777094] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1f7c031d-b233-464d-b944-07f6be103dbd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.809870] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1464.810072] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1464.810289] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Deleting the datastore file [datastore2] a9c14682-d6d7-43a0-b489-bd3f01a5cc17 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1464.810563] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cbcc88a6-3a1a-4926-9741-7951e956adf8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.816399] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for the task: (returnval){ [ 1464.816399] env[60788]: value = "task-2205248" [ 1464.816399] env[60788]: _type = "Task" [ 1464.816399] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1464.823981] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Task: {'id': task-2205248, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1465.057463] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1465.057717] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Creating directory with path [datastore2] vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1465.057963] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05a33c12-d9b3-4156-8c1d-6dd0ee12ec6c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.069367] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Created directory with path [datastore2] vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1465.069567] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Fetch image to [datastore2] vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1465.069735] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1465.070488] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ff6227d-fd41-4762-92c0-eea0f4964e01 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.079038] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a227c8b0-96d7-4c74-9839-02ff7576e248 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.089443] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eae8cb81-2ef8-454f-8740-39d2ad7fa3b4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1465.119612] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dbe30cb-c3ff-4c4f-9982-65131f9bf253 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.125150] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f995196-8ae6-475e-a074-d3897af4a163 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.145898] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1465.195168] env[60788]: DEBUG oslo_vmware.rw_handles [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1465.256963] env[60788]: DEBUG oslo_vmware.rw_handles [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1465.257178] env[60788]: DEBUG oslo_vmware.rw_handles [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1465.326173] env[60788]: DEBUG oslo_vmware.api [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Task: {'id': task-2205248, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044317} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1465.326404] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1465.326591] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1465.326763] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1465.326935] env[60788]: INFO nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1465.327184] env[60788]: DEBUG oslo.service.loopingcall [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1465.327388] env[60788]: DEBUG nova.compute.manager [-] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1465.329447] env[60788]: DEBUG nova.compute.claims [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1465.329618] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1465.329829] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.525634] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802afcbe-1a4a-4c84-a64e-3277c734bdb3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.532663] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de55810c-e07d-4b6f-8b53-0651fe3b4c1f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.562726] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3910c397-f637-49c5-8b8f-117ec88dbe83 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.569616] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-071c402e-358b-4b87-87ad-d383038b2a7e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.582264] env[60788]: DEBUG nova.compute.provider_tree [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1465.590580] env[60788]: DEBUG nova.scheduler.client.report [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1465.605590] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 
tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.276s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.606134] env[60788]: ERROR nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1465.606134] env[60788]: Faults: ['InvalidArgument'] [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Traceback (most recent call last): [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self.driver.spawn(context, instance, image_meta, [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._fetch_image_if_missing(context, vi) [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] image_cache(vi, tmp_image_ds_loc) [ 1465.606134] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] vm_util.copy_virtual_disk( [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] session._wait_for_task(vmdk_copy_task) [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.wait_for_task(task_ref) [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return evt.wait() [ 1465.606518] env[60788]: ERROR 
nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] result = hub.switch() [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.greenlet.switch() [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1465.606518] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self.f(*self.args, **self.kw) [ 1465.606892] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1465.606892] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] raise exceptions.translate_fault(task_info.error) [ 1465.606892] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1465.606892] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Faults: ['InvalidArgument'] [ 1465.606892] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] [ 1465.606892] env[60788]: DEBUG nova.compute.utils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1465.608155] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Build of instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 was re-scheduled: A specified parameter was not correct: fileType [ 1465.608155] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1465.608532] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1465.608757] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1465.608906] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquired lock 
"refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1465.609079] env[60788]: DEBUG nova.network.neutron [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1465.632123] env[60788]: DEBUG nova.network.neutron [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1465.690585] env[60788]: DEBUG nova.network.neutron [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1465.698889] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Releasing lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1465.699172] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1465.699358] env[60788]: DEBUG nova.compute.manager [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}} [ 1465.790235] env[60788]: INFO nova.scheduler.client.report [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Deleted allocations for instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 [ 1465.809718] env[60788]: DEBUG oslo_concurrency.lockutils [None req-edd50038-fae1-4ec7-9b73-40ae94679ce6 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 642.243s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.810852] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 445.715s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.811273] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1465.811352] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.811491] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.813523] env[60788]: INFO nova.compute.manager [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Terminating instance [ 1465.814933] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquiring lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1465.815105] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Acquired lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" 
{{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1465.815268] env[60788]: DEBUG nova.network.neutron [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1465.823806] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1465.842031] env[60788]: DEBUG nova.network.neutron [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1465.875364] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1465.875621] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.877143] env[60788]: INFO nova.compute.claims [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1465.914668] env[60788]: DEBUG nova.network.neutron [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1465.922695] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Releasing lock "refresh_cache-a9c14682-d6d7-43a0-b489-bd3f01a5cc17" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1465.923109] env[60788]: DEBUG nova.compute.manager [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1465.923306] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1465.926339] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-111ea5c9-b41e-474d-b6f6-1083e18f6338 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.936870] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58045417-8b53-40c3-bc1f-3f847d3610e4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.967014] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a9c14682-d6d7-43a0-b489-bd3f01a5cc17 could not be found. [ 1465.967307] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1465.967527] env[60788]: INFO nova.compute.manager [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1465.967774] env[60788]: DEBUG oslo.service.loopingcall [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1465.970115] env[60788]: DEBUG nova.compute.manager [-] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1465.970225] env[60788]: DEBUG nova.network.neutron [-] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1466.092676] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86a0496e-f0f2-473b-8b74-480ec031961e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.101749] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63a5863e-112a-4cef-9d7b-1c1300afb09a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.131771] env[60788]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60788) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1466.132015] env[60788]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
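The Unauthorized/NeutronAdminCredentialConfigurationInvalid pair above is produced by the proxy wrapper in nova/network/neutron.py (the frames at lines 196 and 212 in the tracebacks that follow): every neutronclient call is routed through a wrapper that, when the client was built from the admin credentials in nova.conf, translates a 401 into a configuration error instead of retrying. A minimal Python sketch of that pattern; the exception classes and FakeNeutron below are stand-ins, not the real nova or neutronclient types.

class Unauthorized(Exception):
    """Stand-in for neutronclient.common.exceptions.Unauthorized."""

class NeutronAdminCredentialConfigurationInvalid(Exception):
    """Stand-in for the nova.exception type raised at neutron.py:212."""

class ClientWrapper:
    """Proxies a neutron client and translates auth failures (sketch)."""

    def __init__(self, client, admin=False):
        self._client = client
        self._admin = admin  # True when built from nova.conf [neutron] credentials

    def __getattr__(self, name):
        attr = getattr(self._client, name)
        if not callable(attr):
            return attr

        def wrapper(*args, **kwargs):
            try:
                return attr(*args, **kwargs)  # neutron.py:196: ret = obj(*args, **kwargs)
            except Unauthorized:
                if self._admin:
                    # An admin token that fails auth is an operator configuration
                    # problem, so it is surfaced as a config error, not retried.
                    raise NeutronAdminCredentialConfigurationInvalid()
                raise
        return wrapper

class FakeNeutron:
    def list_ports(self, **search_opts):
        raise Unauthorized('401: The request you have made requires authentication.')

# ClientWrapper(FakeNeutron(), admin=True).list_ports(device_id='a9c14682-...')
# raises NeutronAdminCredentialConfigurationInvalid, matching the entries above.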
[ 1466.132586] env[60788]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-5f2c2134-15ec-4a7a-8494-b06d880a8a55'] [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1466.132586] env[60788]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1466.133091] env[60788]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1466.133091] env[60788]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1466.133579] env[60788]: ERROR oslo.service.loopingcall [ 1466.134097] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98599d88-de72-4484-9deb-75b40e6469f5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.136140] env[60788]: ERROR nova.compute.manager [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1466.143030] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10a0fe2f-732b-468a-abce-49f690b63f68 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.156609] env[60788]: DEBUG nova.compute.provider_tree [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1466.165632] env[60788]: DEBUG nova.scheduler.client.report [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1466.172575] env[60788]: ERROR nova.compute.manager [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Traceback (most recent call last): [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] ret = obj(*args, **kwargs) [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] exception_handler_v20(status_code, error_body) [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] raise client_exc(message=error_message, [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Neutron server returns request_ids: ['req-5f2c2134-15ec-4a7a-8494-b06d880a8a55'] [ 1466.172575] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] [ 1466.172947] env[60788]: ERROR 
nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] During handling of the above exception, another exception occurred: [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Traceback (most recent call last): [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._delete_instance(context, instance, bdms) [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._shutdown_instance(context, instance, bdms) [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._try_deallocate_network(context, instance, requested_networks) [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] with excutils.save_and_reraise_exception(): [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1466.172947] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self.force_reraise() [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] raise self.value [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] _deallocate_network_with_retries() [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return evt.wait() [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] result = hub.switch() [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.greenlet.switch() [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1466.173360] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] result = func(*self.args, **self.kw) [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] result = f(*args, **kwargs) [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._deallocate_network( [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self.network_api.deallocate_for_instance( [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] data = neutron.list_ports(**search_opts) [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] ret = obj(*args, **kwargs) [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.list('ports', self.ports_path, retrieve_all, [ 1466.173706] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] ret = obj(*args, **kwargs) [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] for r in self._pagination(collection, path, **params): [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: 
a9c14682-d6d7-43a0-b489-bd3f01a5cc17] res = self.get(path, params=params) [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] ret = obj(*args, **kwargs) [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.retry_request("GET", action, body=body, [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] ret = obj(*args, **kwargs) [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1466.174040] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] return self.do_request(method, action, body=body, [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] ret = obj(*args, **kwargs) [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] self._handle_fault_response(status_code, replybody, resp) [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1466.174355] env[60788]: ERROR nova.compute.manager [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] [ 1466.182907] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1466.183452] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Start building networks asynchronously for instance. 
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1466.200861] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.390s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1466.202078] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 390.449s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1466.202276] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] During sync_power_state the instance has a pending task (deleting). Skip. [ 1466.202465] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "a9c14682-d6d7-43a0-b489-bd3f01a5cc17" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1466.222141] env[60788]: DEBUG nova.compute.utils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1466.223312] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1466.223522] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1466.232941] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1466.253836] env[60788]: INFO nova.compute.manager [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] [instance: a9c14682-d6d7-43a0-b489-bd3f01a5cc17] Successfully reverted task state from None on failure for instance. 
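The waited/held timings in the lockutils entries through this stretch (held 642.243s by the build, waited 445.715s by terminate, waited 390.449s by the power-state sync) are the standard oslo.concurrency named-semaphore pattern: callers decorated with the same lock name serialize, and the decorator logs how long each caller waited for and then held the lock. A minimal sketch, assuming only oslo.concurrency; the function body is a stand-in for ResourceTracker work, not Nova code.

import time

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def claim_resources(note):
    # Stand-in for ResourceTracker.instance_claim()/abort_instance_claim();
    # while one greenthread holds the 'compute_resources' semaphore, every
    # other caller blocks here, then logs its own waited/held durations.
    time.sleep(0.1)
    return note

claim_resources('claim for 67c365fa-74b8-4a57-abbc-c143990a0292')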
[ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server [None req-ba7c0429-7014-486c-b6a1-79231bccb599 tempest-ServerShowV257Test-468320663 tempest-ServerShowV257Test-468320663-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-5f2c2134-15ec-4a7a-8494-b06d880a8a55'] [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1466.257434] env[60788]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1466.257900] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3328, in terminate_instance [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3323, in do_terminate_instance [ 1466.258362] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server 
raise self.value [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3316, in do_terminate_instance [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3251, in _delete_instance [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3145, in _shutdown_instance [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3059, in _try_deallocate_network [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server raise self.value [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3057, in _try_deallocate_network [ 1466.258805] env[60788]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3046, in _deallocate_network_with_retries [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2266, in _deallocate_network [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.259359] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1466.259860] env[60788]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1466.260337] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1466.260337] env[60788]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1466.260337] env[60788]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1466.260337] env[60788]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1466.260337] env[60788]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1466.260337] env[60788]: ERROR oslo_messaging.rpc.server [ 1466.292031] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1466.317260] env[60788]: DEBUG nova.policy [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9608a7d578f54e3aa974e37153821d4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '936e92b1754a415b9b9d7cff62af1e2b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1466.320637] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1466.320919] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1466.321123] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1466.321350] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1466.321533] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1466.321729] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Chose 
sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1466.321980] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1466.322190] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1466.322393] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1466.322589] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1466.322797] env[60788]: DEBUG nova.virt.hardware [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1466.323966] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e10641-b35a-4ce4-b295-ffd25e2b8a3a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.332053] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1718aa8-7559-4f52-9176-f447382ac2ec {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.676868] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Successfully created port: 31a3bddf-73f6-497a-b682-817377ab98b9 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1467.175522] env[60788]: DEBUG nova.compute.manager [req-f73a8504-33e7-4bf2-b41e-ccae5c245772 req-063f6170-21aa-4611-ac21-ea72107de02b service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Received event network-vif-plugged-31a3bddf-73f6-497a-b682-817377ab98b9 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1467.175756] env[60788]: DEBUG oslo_concurrency.lockutils [req-f73a8504-33e7-4bf2-b41e-ccae5c245772 req-063f6170-21aa-4611-ac21-ea72107de02b service nova] Acquiring lock "67c365fa-74b8-4a57-abbc-c143990a0292-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1467.175947] env[60788]: 
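The nova.virt.hardware entries above narrow flavor/image limits (65536 sockets/cores/threads, no preference) down to a single candidate topology for the 1-vCPU m1.nano flavor. A simplified, hypothetical reconstruction of the enumeration step ("Build topologies for 1 vcpu(s) 1:1:1" yielding [VirtCPUTopology(cores=1,sockets=1,threads=1)]); nova's real code also folds in preferred orderings:

```python
import itertools
from typing import NamedTuple

class VirtCPUTopology(NamedTuple):
    sockets: int
    cores: int
    threads: int

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate socket/core/thread splits whose product is exactly vcpus,
    subject to the per-dimension maxima logged above."""
    for s, c, t in itertools.product(
            range(1, min(vcpus, max_sockets) + 1),
            range(1, min(vcpus, max_cores) + 1),
            range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            yield VirtCPUTopology(s, c, t)

# For 1 vCPU there is exactly one split, matching the log.
print(list(possible_topologies(1)))
```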
DEBUG oslo_concurrency.lockutils [req-f73a8504-33e7-4bf2-b41e-ccae5c245772 req-063f6170-21aa-4611-ac21-ea72107de02b service nova] Lock "67c365fa-74b8-4a57-abbc-c143990a0292-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1467.176136] env[60788]: DEBUG oslo_concurrency.lockutils [req-f73a8504-33e7-4bf2-b41e-ccae5c245772 req-063f6170-21aa-4611-ac21-ea72107de02b service nova] Lock "67c365fa-74b8-4a57-abbc-c143990a0292-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1467.176354] env[60788]: DEBUG nova.compute.manager [req-f73a8504-33e7-4bf2-b41e-ccae5c245772 req-063f6170-21aa-4611-ac21-ea72107de02b service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] No waiting events found dispatching network-vif-plugged-31a3bddf-73f6-497a-b682-817377ab98b9 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1467.176540] env[60788]: WARNING nova.compute.manager [req-f73a8504-33e7-4bf2-b41e-ccae5c245772 req-063f6170-21aa-4611-ac21-ea72107de02b service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Received unexpected event network-vif-plugged-31a3bddf-73f6-497a-b682-817377ab98b9 for instance with vm_state building and task_state spawning. [ 1467.339935] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Successfully updated port: 31a3bddf-73f6-497a-b682-817377ab98b9 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1467.353630] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "refresh_cache-67c365fa-74b8-4a57-abbc-c143990a0292" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1467.353774] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "refresh_cache-67c365fa-74b8-4a57-abbc-c143990a0292" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1467.353922] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1467.429521] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance cache missing network info. 
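The lockutils lines around pop_instance_event show the external-event handshake: the handler takes the per-instance "<uuid>-events" lock, pops a waiter for network-vif-plugged-<port>, and logs the WARNING above when no waiter is registered yet. A minimal sketch of that registry, using threading primitives as stand-ins for nova's eventlet-based ones:

```python
import threading

class InstanceEvents:
    """Pop-under-lock registry: an external event either wakes a waiter
    or is reported as unexpected, as in the WARNING above."""
    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # {instance_uuid: {event_name: threading.Event}}

    def prepare(self, instance_uuid, event_name):
        """Register a waiter before starting the operation that triggers it."""
        with self._lock:
            evt = threading.Event()
            self._events.setdefault(instance_uuid, {})[event_name] = evt
            return evt

    def pop(self, instance_uuid, event_name):
        """Take the per-instance lock (the '<uuid>-events' lock in the log)
        and remove the waiter, if any."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

events = InstanceEvents()
evt = events.pop("67c365fa-74b8-4a57-abbc-c143990a0292",
                 "network-vif-plugged-31a3bddf-73f6-497a-b682-817377ab98b9")
if evt is None:
    print("Received unexpected event: no waiter registered")  # the WARNING path
else:
    evt.set()
```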
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1467.728288] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Updating instance_info_cache with network_info: [{"id": "31a3bddf-73f6-497a-b682-817377ab98b9", "address": "fa:16:3e:6f:df:96", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31a3bddf-73", "ovs_interfaceid": "31a3bddf-73f6-497a-b682-817377ab98b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1467.741364] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "refresh_cache-67c365fa-74b8-4a57-abbc-c143990a0292" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1467.741662] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance network_info: |[{"id": "31a3bddf-73f6-497a-b682-817377ab98b9", "address": "fa:16:3e:6f:df:96", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31a3bddf-73", "ovs_interfaceid": "31a3bddf-73f6-497a-b682-817377ab98b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1467.742059] env[60788]: DEBUG 
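The instance_info_cache payload above is a list of VIF dicts. A short sketch that reads the logged structure (trimmed here to only the fields this example touches) to recover the device name, MAC, fixed IPs, and MTU:

```python
# A trimmed copy of the network_info logged by
# update_instance_cache_with_nw_info above.
network_info = [{
    "id": "31a3bddf-73f6-497a-b682-817377ab98b9",
    "address": "fa:16:3e:6f:df:96",
    "network": {
        "subnets": [{
            "cidr": "192.168.128.0/28",
            "ips": [{"address": "192.168.128.10", "type": "fixed"}],
        }],
        "meta": {"mtu": 8950},
    },
    "type": "ovs",
    "devname": "tap31a3bddf-73",
}]

for vif in network_info:
    fixed = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
    # -> tap31a3bddf-73 fa:16:3e:6f:df:96 ['192.168.128.10'] 8950
    print(vif["devname"], vif["address"], fixed, vif["network"]["meta"]["mtu"])
```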
nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6f:df:96', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed3ffc1d-9f86-4029-857e-6cd1d383edbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '31a3bddf-73f6-497a-b682-817377ab98b9', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1467.749710] env[60788]: DEBUG oslo.service.loopingcall [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1467.750180] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1467.750409] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ceea300-e5d5-4a33-b91d-0dab14ce5b6e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.771058] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1467.771058] env[60788]: value = "task-2205249" [ 1467.771058] env[60788]: _type = "Task" [ 1467.771058] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1467.778690] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205249, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1468.281591] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205249, 'name': CreateVM_Task, 'duration_secs': 0.270731} completed successfully. 
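CreateVM_Task above goes from "progress is 0%" to "completed successfully" with a recorded duration_secs; oslo_vmware's wait_for_task/_poll_task drive that loop. A generic, hypothetical poll-until-done helper in the same spirit (not the oslo_vmware API itself):

```python
import time

class TaskError(Exception):
    pass

def wait_for_task(poll, interval=0.5, timeout=60):
    """Poll `poll()` until it reports success or error. `poll` is assumed to
    return a dict like {'state': 'running'|'success'|'error', 'progress': int}."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            raise TaskError(info.get("error", "task failed"))
        time.sleep(interval)  # the log shows roughly 0.5 s between polls
    raise TimeoutError("task did not complete in time")

# Example: a fake task that succeeds on the second poll.
_polls = iter([{"state": "running", "progress": 0},
               {"state": "success", "progress": 100}])
print(wait_for_task(lambda: next(_polls), interval=0))
```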
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1468.282044] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1468.282513] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1468.282687] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1468.283037] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1468.283310] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e7618352-748e-4a35-95c4-be1a91ad17f5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1468.287462] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1468.287462] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525ab8a7-2261-2fca-861d-996767aee960" [ 1468.287462] env[60788]: _type = "Task" [ 1468.287462] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1468.294985] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]525ab8a7-2261-2fca-861d-996767aee960, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1468.798492] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1468.798713] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1468.798932] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1469.205965] env[60788]: DEBUG nova.compute.manager [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Received event network-changed-31a3bddf-73f6-497a-b682-817377ab98b9 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1469.206200] env[60788]: DEBUG nova.compute.manager [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Refreshing instance network info cache due to event network-changed-31a3bddf-73f6-497a-b682-817377ab98b9. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1469.206457] env[60788]: DEBUG oslo_concurrency.lockutils [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] Acquiring lock "refresh_cache-67c365fa-74b8-4a57-abbc-c143990a0292" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1469.206610] env[60788]: DEBUG oslo_concurrency.lockutils [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] Acquired lock "refresh_cache-67c365fa-74b8-4a57-abbc-c143990a0292" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1469.206774] env[60788]: DEBUG nova.network.neutron [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Refreshing network info cache for port 31a3bddf-73f6-497a-b682-817377ab98b9 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1469.434622] env[60788]: DEBUG nova.network.neutron [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Updated VIF entry in instance network info cache for port 31a3bddf-73f6-497a-b682-817377ab98b9. 
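The "[datastore2] devstack-image-cache_base/<image-id>" lock names above serialize concurrent spawns of the same Glance image against one datastore cache entry. A minimal stand-in for that named-lock pattern (oslo_concurrency.lockutils provides the real implementation):

```python
import threading
from collections import defaultdict

# One process-local lock per composed name; a stand-in for lockutils.
_named_locks = defaultdict(threading.Lock)

def image_cache_lock(datastore, image_id):
    """Compose the per-image lock name seen in the log, e.g.
    '[datastore2] devstack-image-cache_base/<image-id>', so that concurrent
    spawns of the same image serialize on one lock."""
    return _named_locks[f"[{datastore}] devstack-image-cache_base/{image_id}"]

with image_cache_lock("datastore2", "1d9d6f6c-1335-48c8-9690-b6c8e781cb21"):
    pass  # fetch-or-reuse the cached VMDK while holding the lock
```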
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1469.434983] env[60788]: DEBUG nova.network.neutron [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Updating instance_info_cache with network_info: [{"id": "31a3bddf-73f6-497a-b682-817377ab98b9", "address": "fa:16:3e:6f:df:96", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31a3bddf-73", "ovs_interfaceid": "31a3bddf-73f6-497a-b682-817377ab98b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1469.443130] env[60788]: DEBUG oslo_concurrency.lockutils [req-ec76b376-6260-4459-b2a6-890890b6b1ae req-087a293a-1c60-4b13-abb0-53955a7f5eed service nova] Releasing lock "refresh_cache-67c365fa-74b8-4a57-abbc-c143990a0292" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1470.247194] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1502.753825] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1502.754332] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1503.753567] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1504.754064] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1504.754439] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1505.749983] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1506.754314] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1506.754645] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1506.754645] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1506.778631] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.778826] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.778933] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.779233] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.779403] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.779531] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.779674] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. 
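_heal_instance_info_cache above rebuilds its candidate list and skips every instance that is still Building, ending with "Didn't find any instances for network info cache update." A small sketch of that filtering step, with vm_state simplified to a plain string field:

```python
def instances_to_heal(instances):
    """Yield only steady-state instances, mirroring the 'Skipping network
    cache update for instance because it is Building' decisions above."""
    for inst in instances:
        if inst["vm_state"] == "building":
            print(f"[instance: {inst['uuid']}] skipping: Building")
            continue
        yield inst

instances = [{"uuid": "e5084b03-325e-40db-9ffc-0467d53adf38",
              "vm_state": "building"}]
if not list(instances_to_heal(instances)):
    print("Didn't find any instances for network info cache update.")
```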
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.779815] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.779938] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.780070] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1506.780196] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1506.780678] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1507.754245] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1507.767943] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1507.768235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1507.768320] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1507.768450] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1507.769587] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2549bfe-11de-4c26-b166-87ee56c6dc52 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.778181] env[60788]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff1b1174-c3d2-4a1b-8507-d88cb6a9d272 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.792756] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdbd4fdb-ec97-4312-a333-0774cb36e978 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.798728] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdfb9591-d2eb-4a4b-864e-58230c3f670c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1507.827815] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181256MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1507.828015] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1507.828165] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1507.901567] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e5084b03-325e-40db-9ffc-0467d53adf38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.901746] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 529472d7-5e71-4997-96de-64d41b9d3515 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.901876] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902067] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902212] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902332] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902458] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902564] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902680] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.902792] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1507.914234] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1507.924593] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
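The resource tracker entries above list ten instances, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} in placement; with the 512 MB reserved in the inventory, that reproduces the "Final resource view" figures (used_ram=1792MB, used_disk=10GB, used_vcpus=10). The arithmetic, spelled out:

```python
# Ten active instances plus the 512 MB reserved host memory reproduce the
# usage reported in the 'Final resource view' line above.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10
reserved_ram_mb = 512

used_ram = reserved_ram_mb + sum(a["MEMORY_MB"] for a in allocations)
used_disk = sum(a["DISK_GB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)
print(used_ram, used_disk, used_vcpus)  # 1792 10 10
```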
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1507.924822] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1507.924972] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1508.062049] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-305b11d6-e8c6-4855-ab9b-87b6d7ddb869 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.068445] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dc570d7-e5b3-47c4-9597-0c191c0e4997 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.098017] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f66abda3-1031-4023-be11-f22f4674774a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.104885] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e99c9659-be7a-4c0b-bb7b-ac16cc7a2de2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1508.117669] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1508.126154] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1508.140496] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1508.140732] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1509.141159] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1512.493721] env[60788]: WARNING oslo_vmware.rw_handles [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1512.493721] env[60788]: ERROR oslo_vmware.rw_handles [ 1512.494484] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1512.496041] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1512.496298] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Copying Virtual Disk [datastore2] vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/679363e4-1337-4be4-82bf-632e8c73c16f/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1512.496583] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-38090fa3-2976-468d-9d7c-e730829a92a3 {{(pid=60788) request_handler 
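The rw_handles WARNING above shows the image-transfer handle tolerating a server that hangs up instead of sending a final HTTP response: the VMDK data is already on datastore2, so the error is logged and the download is still recorded as complete. A sketch of that best-effort close, with a stub standing in for the real response read:

```python
import http.client

def finish_transfer(read_response):
    """Best-effort close: the payload has already been transferred, so a
    server that drops the connection without a final response is logged,
    not treated as a failure, matching the WARNING above."""
    try:
        read_response()
    except http.client.RemoteDisconnected as exc:
        print(f"WARNING: Error occurred while reading the HTTP response.: {exc!r}")

def _server_hangs_up():
    # Stub for the final getresponse() on the upload/download connection.
    raise http.client.RemoteDisconnected(
        "Remote end closed connection without response")

finish_transfer(_server_hangs_up)  # transfer still treated as complete
```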
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1512.504607] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Waiting for the task: (returnval){ [ 1512.504607] env[60788]: value = "task-2205250" [ 1512.504607] env[60788]: _type = "Task" [ 1512.504607] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1512.512219] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Task: {'id': task-2205250, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.016577] env[60788]: DEBUG oslo_vmware.exceptions [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1513.016879] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1513.017467] env[60788]: ERROR nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1513.017467] env[60788]: Faults: ['InvalidArgument'] [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Traceback (most recent call last): [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] yield resources [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self.driver.spawn(context, instance, image_meta, [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self._fetch_image_if_missing(context, vi) [ 1513.017467] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] image_cache(vi, tmp_image_ds_loc) [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] vm_util.copy_virtual_disk( [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] session._wait_for_task(vmdk_copy_task) [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] return self.wait_for_task(task_ref) [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] return evt.wait() [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] result = hub.switch() [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1513.017908] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] return self.greenlet.switch() [ 1513.018258] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1513.018258] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self.f(*self.args, **self.kw) [ 1513.018258] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1513.018258] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] raise exceptions.translate_fault(task_info.error) [ 1513.018258] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1513.018258] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Faults: ['InvalidArgument'] [ 1513.018258] 
env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] [ 1513.018258] env[60788]: INFO nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Terminating instance [ 1513.019300] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1513.019519] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1513.019769] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-28df5e9e-75f2-47a8-a517-08ccf833cc33 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.022060] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1513.022314] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1513.023072] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f11a79a1-1cb0-4acc-ba01-cd784d1863a3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.030051] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1513.030274] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e69b8c3-fea2-4767-a1f8-53258eb7ef93 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.032458] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1513.032631] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None 
req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1513.033619] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c7b7b9bb-ac83-4347-929d-38f15f281239 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.038657] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1513.038657] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525c2e3d-8209-9213-647d-5d2c97d84f42" [ 1513.038657] env[60788]: _type = "Task" [ 1513.038657] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1513.049929] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]525c2e3d-8209-9213-647d-5d2c97d84f42, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.099401] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1513.099876] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1513.099876] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Deleting the datastore file [datastore2] e5084b03-325e-40db-9ffc-0467d53adf38 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1513.100086] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c542146b-d360-4563-93db-a9ec98f431ed {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.106915] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Waiting for the task: (returnval){ [ 1513.106915] env[60788]: value = "task-2205252" [ 1513.106915] env[60788]: _type = "Task" [ 1513.106915] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1513.114523] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Task: {'id': task-2205252, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.552761] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1513.553082] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1513.553308] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-20ac56f1-ba9b-4a2e-a95b-61b208201f79 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.565325] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1513.565506] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Fetch image to [datastore2] vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1513.565681] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1513.566524] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-283068b1-31e3-42e0-b7d7-cdb46758bbdd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.573445] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5bc4bf8-c8d3-4104-9f54-03db0e64b5a2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.582555] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a4e0fc70-c232-45e7-a391-1f2f495ba691 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.616897] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a862d466-13e3-4734-a313-d3ffeeacc0f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.626421] env[60788]: DEBUG oslo_vmware.api [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Task: {'id': task-2205252, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080401} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1513.626648] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4eef8074-6bf8-42d6-9a4c-9dbd80c9fc5d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.628398] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1513.628596] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1513.628768] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1513.628938] env[60788]: INFO nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1513.630998] env[60788]: DEBUG nova.compute.claims [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1513.631190] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1513.631405] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1513.649340] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1513.705249] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1513.767073] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1513.767286] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1513.873699] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cde88917-d87a-4563-ad66-293d51253372 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.881826] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf00e28e-2f27-4d44-94d9-804af38d0357 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.915582] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca69d16c-c05e-4916-898f-82a2c263a942 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.923063] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-141cc4ad-f0d6-457e-bfb7-9c8542eb1f4d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1513.936645] env[60788]: DEBUG nova.compute.provider_tree [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1513.947970] env[60788]: DEBUG nova.scheduler.client.report [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1513.966975] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.335s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1513.967594] env[60788]: ERROR nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1513.967594] env[60788]: Faults: ['InvalidArgument']
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Traceback (most recent call last):
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self.driver.spawn(context, instance, image_meta,
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self._fetch_image_if_missing(context, vi)
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] image_cache(vi, tmp_image_ds_loc)
[ 1513.967594] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] vm_util.copy_virtual_disk(
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] session._wait_for_task(vmdk_copy_task)
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] return self.wait_for_task(task_ref)
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] return evt.wait()
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] result = hub.switch()
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] return self.greenlet.switch()
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1513.967896] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] self.f(*self.args, **self.kw)
[ 1513.968208] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1513.968208] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] raise exceptions.translate_fault(task_info.error)
[ 1513.968208] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1513.968208] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Faults: ['InvalidArgument']
[ 1513.968208] env[60788]: ERROR nova.compute.manager [instance: e5084b03-325e-40db-9ffc-0467d53adf38]
[ 1513.968327] env[60788]: DEBUG nova.compute.utils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1513.970118] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Build of instance e5084b03-325e-40db-9ffc-0467d53adf38 was re-scheduled: A specified parameter was not correct: fileType
[ 1513.970118] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1513.970856] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1513.971122] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1513.971382] env[60788]: DEBUG nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1513.971650] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1514.495583] env[60788]: DEBUG nova.network.neutron [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1514.515366] env[60788]: INFO nova.compute.manager [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Took 0.54 seconds to deallocate network for instance.
[ 1514.688118] env[60788]: INFO nova.scheduler.client.report [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Deleted allocations for instance e5084b03-325e-40db-9ffc-0467d53adf38
[ 1514.721125] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9289b733-0805-4241-bcac-a6ac83f0e4af tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "e5084b03-325e-40db-9ffc-0467d53adf38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 639.824s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1514.724026] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "e5084b03-325e-40db-9ffc-0467d53adf38" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 442.452s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1514.724026] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Acquiring lock "e5084b03-325e-40db-9ffc-0467d53adf38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1514.724026] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "e5084b03-325e-40db-9ffc-0467d53adf38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1514.724291] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "e5084b03-325e-40db-9ffc-0467d53adf38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1514.725026] env[60788]: INFO nova.compute.manager [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Terminating instance
[ 1514.726978] env[60788]: DEBUG nova.compute.manager [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1514.727549] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1514.727717] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f311ffd5-2c00-4522-bfa0-6b8553c396ef {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1514.736424] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8e560d5-bd44-4448-b465-0fc23e21d57e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1514.747547] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1514.769239] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e5084b03-325e-40db-9ffc-0467d53adf38 could not be found.
[ 1514.771279] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1514.771279] env[60788]: INFO nova.compute.manager [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1514.771279] env[60788]: DEBUG oslo.service.loopingcall [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1514.771279] env[60788]: DEBUG nova.compute.manager [-] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1514.771279] env[60788]: DEBUG nova.network.neutron [-] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1514.846440] env[60788]: DEBUG nova.network.neutron [-] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1514.860264] env[60788]: INFO nova.compute.manager [-] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] Took 0.09 seconds to deallocate network for instance.
[ 1514.868308] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1514.868308] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1514.868308] env[60788]: INFO nova.compute.claims [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1514.993705] env[60788]: DEBUG oslo_concurrency.lockutils [None req-53d6bd5b-7533-478c-b17f-962fc12c2211 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090 tempest-FloatingIPsAssociationNegativeTestJSON-1795614090-project-member] Lock "e5084b03-325e-40db-9ffc-0467d53adf38" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.271s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1514.994636] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "e5084b03-325e-40db-9ffc-0467d53adf38" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 439.242s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1514.995426] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e5084b03-325e-40db-9ffc-0467d53adf38] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1514.995426] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "e5084b03-325e-40db-9ffc-0467d53adf38" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1515.117582] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3303be4-3ad9-4909-9bf9-cb16bdacaa5b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.125252] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0c0539f-5018-4149-b860-779d7592d04d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.159020] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ff1bf4f-600e-4d31-8c7e-9927543d6862 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.164113] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76bf8062-5b0a-4bdb-9d3d-cc62a6c99a45 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.179019] env[60788]: DEBUG nova.compute.provider_tree [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1515.187218] env[60788]: DEBUG nova.scheduler.client.report [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1515.205435] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1515.205940] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1515.242565] env[60788]: DEBUG nova.compute.utils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1515.245993] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1515.245993] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1515.264049] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1515.310773] env[60788]: DEBUG nova.policy [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '571aaecebbc249e3ae4d9306e1e109ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e80c355190594f5a960ca2d14c3f010c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1515.334922] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1515.362016] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1515.362777] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1515.362961] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1515.363331] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1515.363629] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1515.363831] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1515.364096] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1515.364282] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1515.365285] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1515.365285] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1515.365285] env[60788]: DEBUG nova.virt.hardware [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1515.366168] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e816317-9e74-4cf0-b82c-33217b53a2d9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1515.375464] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c3664fa-720f-48da-b0b1-e98655886ed7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1516.050321] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Successfully created port: ccb1908e-2290-4a85-a670-a3440298eaf7 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1517.177215] env[60788]: DEBUG nova.compute.manager [req-17a746b0-c102-4630-9b66-0881f341fad9 req-1d61d4c7-2cca-4a13-a095-6271ce4b094c service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Received event network-vif-plugged-ccb1908e-2290-4a85-a670-a3440298eaf7 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1517.177215] env[60788]: DEBUG oslo_concurrency.lockutils [req-17a746b0-c102-4630-9b66-0881f341fad9 req-1d61d4c7-2cca-4a13-a095-6271ce4b094c service nova] Acquiring lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1517.177215] env[60788]: DEBUG oslo_concurrency.lockutils [req-17a746b0-c102-4630-9b66-0881f341fad9 req-1d61d4c7-2cca-4a13-a095-6271ce4b094c service nova] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1517.177215] env[60788]: DEBUG oslo_concurrency.lockutils [req-17a746b0-c102-4630-9b66-0881f341fad9 req-1d61d4c7-2cca-4a13-a095-6271ce4b094c service nova] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1517.177541] env[60788]: DEBUG nova.compute.manager [req-17a746b0-c102-4630-9b66-0881f341fad9 req-1d61d4c7-2cca-4a13-a095-6271ce4b094c service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] No waiting events found dispatching network-vif-plugged-ccb1908e-2290-4a85-a670-a3440298eaf7 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1517.178202] env[60788]: WARNING nova.compute.manager [req-17a746b0-c102-4630-9b66-0881f341fad9 req-1d61d4c7-2cca-4a13-a095-6271ce4b094c service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Received unexpected event network-vif-plugged-ccb1908e-2290-4a85-a670-a3440298eaf7 for instance with vm_state building and task_state spawning.
[ 1517.498799] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Successfully updated port: ccb1908e-2290-4a85-a670-a3440298eaf7 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1517.511853] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "refresh_cache-e3671c90-83c7-48f3-8b2a-97f34ab2505e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1517.511853] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "refresh_cache-e3671c90-83c7-48f3-8b2a-97f34ab2505e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1517.511853] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1517.595704] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1517.987856] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Updating instance_info_cache with network_info: [{"id": "ccb1908e-2290-4a85-a670-a3440298eaf7", "address": "fa:16:3e:c5:af:b9", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapccb1908e-22", "ovs_interfaceid": "ccb1908e-2290-4a85-a670-a3440298eaf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1518.003479] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "refresh_cache-e3671c90-83c7-48f3-8b2a-97f34ab2505e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1518.003479] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance network_info: |[{"id": "ccb1908e-2290-4a85-a670-a3440298eaf7", "address": "fa:16:3e:c5:af:b9", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapccb1908e-22", "ovs_interfaceid": "ccb1908e-2290-4a85-a670-a3440298eaf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 1518.003907] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c5:af:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f4a795c-8718-4a7c-aafe-9da231df10f8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ccb1908e-2290-4a85-a670-a3440298eaf7', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1518.012350] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating folder: Project (e80c355190594f5a960ca2d14c3f010c). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1518.012955] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cf64f824-677c-40d0-b30a-211c087658ba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1518.024907] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Created folder: Project (e80c355190594f5a960ca2d14c3f010c) in parent group-v449747.
[ 1518.025138] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating folder: Instances. Parent ref: group-v449834. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1518.025387] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dd4ae9d0-40b2-4a72-a502-36e2322c6c3e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1518.034378] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Created folder: Instances in parent group-v449834.
[ 1518.034614] env[60788]: DEBUG oslo.service.loopingcall [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1518.034816] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1518.035049] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-164f8edc-055e-4f55-95f2-12d659ddf751 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1518.054508] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1518.054508] env[60788]: value = "task-2205255"
[ 1518.054508] env[60788]: _type = "Task"
[ 1518.054508] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1518.061922] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205255, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1518.564751] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205255, 'name': CreateVM_Task} progress is 99%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1519.064600] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205255, 'name': CreateVM_Task, 'duration_secs': 0.597893} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1519.064826] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1519.065480] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1519.065643] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1519.065967] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1519.066235] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-736fe0f1-882d-48e1-9208-c4ddab392354 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1519.070520] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){
[ 1519.070520] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]529edfe4-06e5-c458-8e67-1eb5abd48e7e"
[ 1519.070520] env[60788]: _type = "Task"
[ 1519.070520] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1519.079581] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]529edfe4-06e5-c458-8e67-1eb5abd48e7e, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1519.220546] env[60788]: DEBUG nova.compute.manager [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Received event network-changed-ccb1908e-2290-4a85-a670-a3440298eaf7 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1519.220546] env[60788]: DEBUG nova.compute.manager [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Refreshing instance network info cache due to event network-changed-ccb1908e-2290-4a85-a670-a3440298eaf7. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1519.221071] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] Acquiring lock "refresh_cache-e3671c90-83c7-48f3-8b2a-97f34ab2505e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1519.221161] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] Acquired lock "refresh_cache-e3671c90-83c7-48f3-8b2a-97f34ab2505e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1519.221367] env[60788]: DEBUG nova.network.neutron [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Refreshing network info cache for port ccb1908e-2290-4a85-a670-a3440298eaf7 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1519.559300] env[60788]: DEBUG nova.network.neutron [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Updated VIF entry in instance network info cache for port ccb1908e-2290-4a85-a670-a3440298eaf7. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1519.559652] env[60788]: DEBUG nova.network.neutron [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Updating instance_info_cache with network_info: [{"id": "ccb1908e-2290-4a85-a670-a3440298eaf7", "address": "fa:16:3e:c5:af:b9", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapccb1908e-22", "ovs_interfaceid": "ccb1908e-2290-4a85-a670-a3440298eaf7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1519.569569] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f0ae554-10ad-4ee0-b352-cd7c18d1042a req-0142952c-5314-4b22-a1f2-ce6af6f66beb service nova] Releasing lock "refresh_cache-e3671c90-83c7-48f3-8b2a-97f34ab2505e" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1519.580703] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1519.580934] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1519.581159] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1522.041010] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "67c365fa-74b8-4a57-abbc-c143990a0292" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1522.783811] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1522.784059] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1528.849932] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1528.850326] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1558.750069] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1560.910968] env[60788]: WARNING oslo_vmware.rw_handles [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1560.910968] env[60788]: ERROR oslo_vmware.rw_handles [ 1560.911653] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1560.913664] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1560.913966] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Copying Virtual Disk [datastore2] vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/e888cc72-9e68-465f-a410-03974cb98b24/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1560.914315] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2eacd6fc-bead-4a41-9ea8-c7e2fd36db27 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.924555] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1560.924555] env[60788]: value = "task-2205256" [ 1560.924555] env[60788]: _type = "Task" [ 1560.924555] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1560.933146] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205256, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.435303] env[60788]: DEBUG oslo_vmware.exceptions [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1561.435592] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1561.436155] env[60788]: ERROR nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1561.436155] env[60788]: Faults: ['InvalidArgument'] [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Traceback (most recent call last): [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] yield resources [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self.driver.spawn(context, instance, image_meta, [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self._fetch_image_if_missing(context, vi) [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1561.436155] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] image_cache(vi, tmp_image_ds_loc) [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] vm_util.copy_virtual_disk( [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] session._wait_for_task(vmdk_copy_task) [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] return self.wait_for_task(task_ref) [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] return evt.wait() [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] result = hub.switch() [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] return self.greenlet.switch() [ 1561.436589] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1561.437011] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self.f(*self.args, **self.kw) [ 1561.437011] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1561.437011] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] raise exceptions.translate_fault(task_info.error) [ 1561.437011] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1561.437011] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Faults: ['InvalidArgument'] [ 1561.437011] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] [ 1561.437011] env[60788]: INFO nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Terminating instance [ 1561.437988] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1561.438210] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1561.438447] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbd9a316-6696-42a6-92cf-c2b895a7da66 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.440731] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1561.440923] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1561.441626] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f5416f6-fa40-44c2-950f-1480492b2b55 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.448081] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1561.448295] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-eb4e7754-90f9-4ad6-ac77-eeab2649f5a3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.450379] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1561.450579] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1561.451532] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bae9e48f-1edd-4002-b97b-275083611f66 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.456229] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){ [ 1561.456229] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52cddfbe-c637-d9a8-699a-9201fb88f8eb" [ 1561.456229] env[60788]: _type = "Task" [ 1561.456229] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1561.463040] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52cddfbe-c637-d9a8-699a-9201fb88f8eb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.515990] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1561.516233] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1561.516347] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleting the datastore file [datastore2] 529472d7-5e71-4997-96de-64d41b9d3515 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1561.516615] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f7fb4e7c-21ba-46fc-8e6d-d394fb79e6e0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.522686] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1561.522686] env[60788]: value = "task-2205258" [ 1561.522686] env[60788]: _type = "Task" [ 1561.522686] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1561.530215] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205258, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.967663] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1561.968127] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating directory with path [datastore2] vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1561.968333] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-00fd9206-4801-4e94-aa6a-7009f759654f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.979871] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Created directory with path [datastore2] vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1561.980070] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Fetch image to [datastore2] vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1561.980248] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1561.980978] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42cf2311-0b4e-44f6-8b7a-ac7959a305f1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.987714] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85169c3b-e151-44a0-8688-38ab9e93b9ba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.997716] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97ec542b-53a3-4aa1-a495-f7ee86f75ffc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.031018] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5be2a76c-8506-4056-986a-4f19cc786313 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.038053] env[60788]: DEBUG oslo_vmware.api [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205258, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074664} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1562.039513] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1562.039718] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1562.039892] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1562.040078] env[60788]: INFO nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Took 0.60 seconds to destroy the instance on the hypervisor. 
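The CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same shape: the SOAP call returns a Task managed object, and oslo.vmware polls it until it reports success or a fault, logging "progress is N%." on each pass and raising the translated fault on error (which is how the VimFaultException surfaced in the traceback). The sketch below is a simplified toy model of that loop, not the real oslo_vmware.api code; fetch_task_info is a hypothetical callable standing in for the PropertyCollector read that _poll_task performs, and the ~0.5 s cadence mirrors the spacing of the task-2205255 poll lines.

    import time

    class VimFaultException(Exception):
        # Simplified stand-in for oslo_vmware.exceptions.VimFaultException.
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(fetch_task_info, poll_interval=0.5):
        # fetch_task_info() returns a dict shaped like the logged task info:
        # {'state': 'running'|'success'|'error', 'progress': int, ...}
        while True:
            info = fetch_task_info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # The real library translates the VIM fault at this point;
                # that is the "raise exceptions.translate_fault(...)" frame
                # visible in the tracebacks above.
                raise VimFaultException(info.get('faults', []), info['error'])
            # Matches the "progress is 0%." / "progress is 99%." debug lines.
            print("progress is %d%%." % info.get('progress', 0))
            time.sleep(poll_interval)

    # Toy task that runs through 0% -> 99% -> success, like task-2205255.
    _states = iter([
        {'state': 'running', 'progress': 0},
        {'state': 'running', 'progress': 99},
        {'state': 'success', 'progress': 100},
    ])
    print(wait_for_task(lambda: next(_states), poll_interval=0.01))
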
[ 1562.041883] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-907c649c-85db-4a5c-b8dc-35e9505cc633 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.043840] env[60788]: DEBUG nova.compute.claims [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1562.044025] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1562.044232] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1562.068307] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1562.209150] env[60788]: DEBUG oslo_vmware.rw_handles [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1562.269125] env[60788]: DEBUG oslo_vmware.rw_handles [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1562.269320] env[60788]: DEBUG oslo_vmware.rw_handles [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1562.298169] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ba84713-f843-4b87-91c1-bc71a685e073 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.305572] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62c2041d-f28b-4c8b-9852-b2a189b4df53 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.336308] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5851d73-31bd-4c8d-88e0-39cc7bf9ea77 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.343033] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a55c0c-4a9b-4d6d-a22c-7bf0230fffc9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.355788] env[60788]: DEBUG nova.compute.provider_tree [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1562.364243] env[60788]: DEBUG nova.scheduler.client.report [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1562.378149] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.334s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.378720] env[60788]: ERROR nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1562.378720] env[60788]: Faults: ['InvalidArgument'] [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Traceback (most recent call last): [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] 
self.driver.spawn(context, instance, image_meta, [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self._fetch_image_if_missing(context, vi) [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] image_cache(vi, tmp_image_ds_loc) [ 1562.378720] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] vm_util.copy_virtual_disk( [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] session._wait_for_task(vmdk_copy_task) [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] return self.wait_for_task(task_ref) [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] return evt.wait() [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] result = hub.switch() [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] return self.greenlet.switch() [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1562.379060] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] self.f(*self.args, **self.kw) [ 1562.379431] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1562.379431] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] raise exceptions.translate_fault(task_info.error) [ 1562.379431] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1562.379431] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Faults: ['InvalidArgument'] [ 1562.379431] env[60788]: ERROR nova.compute.manager [instance: 529472d7-5e71-4997-96de-64d41b9d3515] [ 1562.379431] env[60788]: DEBUG nova.compute.utils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1562.380690] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Build of instance 529472d7-5e71-4997-96de-64d41b9d3515 was re-scheduled: A specified parameter was not correct: fileType [ 1562.380690] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1562.381065] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1562.381241] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1562.381408] env[60788]: DEBUG nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1562.381600] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1562.753583] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1562.753947] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1562.909410] env[60788]: DEBUG nova.network.neutron [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1562.922344] env[60788]: INFO nova.compute.manager [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Took 0.54 seconds to deallocate network for instance. 
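Every "Acquiring lock ... by ...", "Lock ... acquired ... waited", "Lock ... released ... held" triple in this trace comes from oslo.concurrency's lockutils wrapper (the inner function named in the log lines), which records how long each caller waited for and then held a named lock; the 630.023 s hold reported just below covers the whole _locked_do_build_and_run_instance critical section, including the failed spawn and the reschedule. A minimal sketch of the two forms Nova uses, assuming oslo.concurrency is installed; the lock names are illustrative:

    import time

    from oslo_concurrency import lockutils

    # Decorator form: serialize work per instance UUID, the way
    # _locked_do_build_and_run_instance does in the entries above.
    @lockutils.synchronized('529472d7-5e71-4997-96de-64d41b9d3515')
    def build_instance():
        time.sleep(0.1)  # stand-in for the actual spawn work
        return 'done'

    # Context-manager form, as used for the refresh_cache-* locks.
    with lockutils.lock('refresh_cache-529472d7-5e71-4997-96de-64d41b9d3515'):
        pass  # critical section

    print(build_instance())
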
[ 1563.056262] env[60788]: INFO nova.scheduler.client.report [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted allocations for instance 529472d7-5e71-4997-96de-64d41b9d3515 [ 1563.105020] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4601b7c4-06b7-4c79-83ab-1ec8ec3b8588 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "529472d7-5e71-4997-96de-64d41b9d3515" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.023s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.105020] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "529472d7-5e71-4997-96de-64d41b9d3515" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.677s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1563.105020] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "529472d7-5e71-4997-96de-64d41b9d3515-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1563.105020] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "529472d7-5e71-4997-96de-64d41b9d3515-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1563.105372] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "529472d7-5e71-4997-96de-64d41b9d3515-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.107381] env[60788]: INFO nova.compute.manager [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Terminating instance [ 1563.109556] env[60788]: DEBUG nova.compute.manager [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1563.109786] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1563.110494] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9cfd4d64-0cf0-486a-bd23-453ae12cb0e3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.120203] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf93e4d-d1cb-44f9-b106-36e7a599d66a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.131374] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1563.151386] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 529472d7-5e71-4997-96de-64d41b9d3515 could not be found. [ 1563.151586] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1563.151787] env[60788]: INFO nova.compute.manager [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1563.152042] env[60788]: DEBUG oslo.service.loopingcall [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1563.152267] env[60788]: DEBUG nova.compute.manager [-] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1563.152362] env[60788]: DEBUG nova.network.neutron [-] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1563.177186] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1563.177430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1563.178839] env[60788]: INFO nova.compute.claims [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1563.190789] env[60788]: DEBUG nova.network.neutron [-] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1563.209265] env[60788]: INFO nova.compute.manager [-] [instance: 529472d7-5e71-4997-96de-64d41b9d3515] Took 0.06 seconds to deallocate network for instance. 
[ 1563.312423] env[60788]: DEBUG oslo_concurrency.lockutils [None req-fab7f003-9b92-4077-a1ff-c833802a11c9 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "529472d7-5e71-4997-96de-64d41b9d3515" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.208s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.390085] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e5bd363-2b21-4bb1-b43c-dba203533341 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.398453] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21f74df8-9d33-4b9c-81c4-a251981a3c2e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.429575] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06572f7f-135e-424e-b77f-ab68e57818a5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.436665] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-351a0090-944f-4f50-bda3-c8d7da593aa4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.449577] env[60788]: DEBUG nova.compute.provider_tree [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1563.457761] env[60788]: DEBUG nova.scheduler.client.report [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1563.469995] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.470467] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Start building networks asynchronously for instance. 
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1563.508310] env[60788]: DEBUG nova.compute.utils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1563.509630] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1563.509814] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1563.520082] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1563.583055] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1563.606557] env[60788]: DEBUG nova.policy [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a76df7c693b24512b3f6e13f0e279cc8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b0531506ed4843b80fbcf3c09c73aa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1563.611478] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1563.611767] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1563.611967] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1563.612218] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1563.612401] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1563.612793] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1563.613033] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1563.613253] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1563.613457] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1563.613680] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1563.613893] env[60788]: DEBUG nova.virt.hardware [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1563.616232] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d272a695-92fe-4950-8271-45c392f9401f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1563.625573] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5995de7c-5b50-4fb9-8bcb-545f13baf5af {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.930137] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Successfully created port: 4425909d-9e81-4d50-a83e-05fa29aea2f5 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1564.753254] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1564.753576] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1564.753723] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1564.871234] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Successfully updated port: 4425909d-9e81-4d50-a83e-05fa29aea2f5 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1564.888859] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "refresh_cache-e34c6299-ae90-4e5a-b272-3623dfe876c0" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1564.889034] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired lock "refresh_cache-e34c6299-ae90-4e5a-b272-3623dfe876c0" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1564.889194] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1564.930056] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1565.095792] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Updating instance_info_cache with network_info: [{"id": "4425909d-9e81-4d50-a83e-05fa29aea2f5", "address": "fa:16:3e:9e:49:ed", "network": {"id": "73db1047-0c76-4640-8949-602913ca4a2c", "bridge": "br-int", "label": "tempest-ServersTestJSON-360084334-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70b0531506ed4843b80fbcf3c09c73aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9aa05ef8-c7bb-4af5-983f-bfa0f3f88223", "external-id": "nsx-vlan-transportzone-135", "segmentation_id": 135, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4425909d-9e", "ovs_interfaceid": "4425909d-9e81-4d50-a83e-05fa29aea2f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1565.106911] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Releasing lock "refresh_cache-e34c6299-ae90-4e5a-b272-3623dfe876c0" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1565.107384] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance network_info: |[{"id": "4425909d-9e81-4d50-a83e-05fa29aea2f5", "address": "fa:16:3e:9e:49:ed", "network": {"id": "73db1047-0c76-4640-8949-602913ca4a2c", "bridge": "br-int", "label": "tempest-ServersTestJSON-360084334-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70b0531506ed4843b80fbcf3c09c73aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9aa05ef8-c7bb-4af5-983f-bfa0f3f88223", "external-id": "nsx-vlan-transportzone-135", "segmentation_id": 135, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4425909d-9e", "ovs_interfaceid": "4425909d-9e81-4d50-a83e-05fa29aea2f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1565.107631] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9e:49:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9aa05ef8-c7bb-4af5-983f-bfa0f3f88223', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4425909d-9e81-4d50-a83e-05fa29aea2f5', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1565.115213] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating folder: Project (70b0531506ed4843b80fbcf3c09c73aa). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1565.115801] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92f9147b-63f5-4069-94f7-73e588d3a747 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.127939] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Created folder: Project (70b0531506ed4843b80fbcf3c09c73aa) in parent group-v449747. [ 1565.128171] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating folder: Instances. Parent ref: group-v449837. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1565.128416] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37d7262c-23bc-495e-b278-811018f336f1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.137723] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Created folder: Instances in parent group-v449837. [ 1565.137968] env[60788]: DEBUG oslo.service.loopingcall [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1565.138170] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1565.138384] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2ea26643-bfd4-4d43-aa7d-9c11954a46c5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1565.157892] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1565.157892] env[60788]: value = "task-2205261" [ 1565.157892] env[60788]: _type = "Task" [ 1565.157892] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1565.165525] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205261, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1565.242575] env[60788]: DEBUG nova.compute.manager [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Received event network-vif-plugged-4425909d-9e81-4d50-a83e-05fa29aea2f5 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1565.242959] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] Acquiring lock "e34c6299-ae90-4e5a-b272-3623dfe876c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1565.243264] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1565.243459] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1565.243728] env[60788]: DEBUG nova.compute.manager [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] No waiting events found dispatching network-vif-plugged-4425909d-9e81-4d50-a83e-05fa29aea2f5 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1565.243905] env[60788]: WARNING nova.compute.manager [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Received unexpected event network-vif-plugged-4425909d-9e81-4d50-a83e-05fa29aea2f5 for instance with vm_state building and task_state spawning. [ 1565.244102] env[60788]: DEBUG nova.compute.manager [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Received event network-changed-4425909d-9e81-4d50-a83e-05fa29aea2f5 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1565.244269] env[60788]: DEBUG nova.compute.manager [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Refreshing instance network info cache due to event network-changed-4425909d-9e81-4d50-a83e-05fa29aea2f5. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1565.244479] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] Acquiring lock "refresh_cache-e34c6299-ae90-4e5a-b272-3623dfe876c0" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1565.244631] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] Acquired lock "refresh_cache-e34c6299-ae90-4e5a-b272-3623dfe876c0" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1565.244796] env[60788]: DEBUG nova.network.neutron [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Refreshing network info cache for port 4425909d-9e81-4d50-a83e-05fa29aea2f5 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1565.671801] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205261, 'name': CreateVM_Task} progress is 99%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1565.957190] env[60788]: DEBUG nova.network.neutron [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Updated VIF entry in instance network info cache for port 4425909d-9e81-4d50-a83e-05fa29aea2f5. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1565.957575] env[60788]: DEBUG nova.network.neutron [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Updating instance_info_cache with network_info: [{"id": "4425909d-9e81-4d50-a83e-05fa29aea2f5", "address": "fa:16:3e:9e:49:ed", "network": {"id": "73db1047-0c76-4640-8949-602913ca4a2c", "bridge": "br-int", "label": "tempest-ServersTestJSON-360084334-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70b0531506ed4843b80fbcf3c09c73aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9aa05ef8-c7bb-4af5-983f-bfa0f3f88223", "external-id": "nsx-vlan-transportzone-135", "segmentation_id": 135, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4425909d-9e", "ovs_interfaceid": "4425909d-9e81-4d50-a83e-05fa29aea2f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1565.967652] env[60788]: DEBUG oslo_concurrency.lockutils [req-1f134bc3-fbe3-43d3-b60b-f284a9a48727 req-eca42848-2230-4b97-8add-7723514e0190 service nova] Releasing lock "refresh_cache-e34c6299-ae90-4e5a-b272-3623dfe876c0" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} 
[ 1566.172665] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205261, 'name': CreateVM_Task, 'duration_secs': 0.521033} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1566.172880] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1566.173527] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1566.173699] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1566.174027] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1566.174273] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b0c65a9e-41c1-46f4-a056-b1258b86ca6e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1566.178256] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for the task: (returnval){ [ 1566.178256] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52121f12-ab47-3a35-1393-2427c94b8b00" [ 1566.178256] env[60788]: _type = "Task" [ 1566.178256] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1566.186512] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52121f12-ab47-3a35-1393-2427c94b8b00, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1566.690297] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1566.690596] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1566.690822] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1566.749536] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1566.753239] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1567.753773] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1567.754085] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1567.754085] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1567.775119] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.775295] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.775430] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.775559] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.775684] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.775811] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.775933] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.776195] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.776341] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.776464] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1567.776585] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1568.753876] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1568.767072] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1568.767302] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1568.767476] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1568.767632] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1568.768788] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd079370-8d67-48c1-9517-9fbc4fc97899 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1568.783422] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a05b756-d2d1-461e-94d6-7ca251025d0e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1568.802591] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3c3daee-27a2-4a8f-a223-1137a3b532f9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1568.812466] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86be2829-1ee3-46a8-bece-b31f7c203dd7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1568.843203] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181267MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1568.850435] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1568.850435] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1568.918510] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 28605b2e-9795-47a0-821c-5cf8da077d37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.918675] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.918808] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.918937] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.919073] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.919197] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.919314] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.919431] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.919546] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.919661] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1568.930359] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1568.941709] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1568.941941] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1568.942104] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1569.085189] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59c96fbd-656b-4b63-8364-9e0181f1d252 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1569.093150] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bfffb72-b82f-4192-9287-cac22504f748 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1569.126714] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-672d6ead-029e-4675-8cae-49f37bb2856d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1569.134311] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-878bb044-8a85-4069-887c-baf8cc5ad801 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1569.147314] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1569.156185] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1569.170032] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1569.170235] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1570.170559] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1576.242545] env[60788]: DEBUG oslo_concurrency.lockutils [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1597.275528] env[60788]: DEBUG oslo_concurrency.lockutils [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1608.087495] env[60788]: WARNING oslo_vmware.rw_handles [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1608.087495] env[60788]: ERROR oslo_vmware.rw_handles [ 1608.088273] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1608.090298] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1608.090549] env[60788]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Copying Virtual Disk [datastore2] vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/284119b1-6f25-4e4c-9c00-6c3cc86bf7e8/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1608.090849] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dd4d74b0-16c4-46ae-bcb0-1ecb3e4f1a8b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.099416] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){
[ 1608.099416] env[60788]: value = "task-2205262"
[ 1608.099416] env[60788]: _type = "Task"
[ 1608.099416] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1608.107530] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': task-2205262, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1608.609678] env[60788]: DEBUG oslo_vmware.exceptions [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1608.609993] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1608.610597] env[60788]: ERROR nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1608.610597] env[60788]: Faults: ['InvalidArgument']
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Traceback (most recent call last):
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] yield resources
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self.driver.spawn(context, instance, image_meta,
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self._fetch_image_if_missing(context, vi)
[ 1608.610597] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] image_cache(vi, tmp_image_ds_loc)
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] vm_util.copy_virtual_disk(
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] session._wait_for_task(vmdk_copy_task)
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] return self.wait_for_task(task_ref)
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] return evt.wait()
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] result = hub.switch()
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1608.610999] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] return self.greenlet.switch()
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self.f(*self.args, **self.kw)
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] raise exceptions.translate_fault(task_info.error)
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Faults: ['InvalidArgument']
[ 1608.611456] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37]
[ 1608.611456] env[60788]: INFO nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Terminating instance
[ 1608.612576] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1608.612826] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1608.613091] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a9a68b42-ea27-4e15-a5b4-d9b6a02a74b0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.615442] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1608.615636] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1608.616463] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a66bf1bc-b335-4a23-a9d9-1f1d79764eae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.623143] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1608.623382] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cb29d04e-5489-4452-8549-3d1c5947c7f5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.625405] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1608.625584] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1608.626541] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-983ffaf2-e438-4637-b2b3-630ca46f8c1e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.631164] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Waiting for the task: (returnval){
[ 1608.631164] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52e0686b-5abc-15ee-3c7d-713f6884d206"
[ 1608.631164] env[60788]: _type = "Task"
[ 1608.631164] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1608.645375] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1608.645592] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Creating directory with path [datastore2] vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1608.645801] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-29597aa5-61c0-4af4-9f8a-af07123a5c6c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.664890] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Created directory with path [datastore2] vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1608.665112] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Fetch image to [datastore2] vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1608.665301] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1608.666092] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ee12871-8f6a-44b2-a88c-47c92e9a3694 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.673101] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3dd54824-fea8-4a45-ab40-1d052c3faec7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.682293] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f923ad1c-a3a0-4d6e-9765-a505ba94e772 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.716898] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d653c6b-b441-4c74-8e4c-334c6c9a90d1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.719613] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1608.719809] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1608.719979] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Deleting the datastore file [datastore2] 28605b2e-9795-47a0-821c-5cf8da077d37 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1608.720839] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dfebe150-6a97-4bd3-89a7-f02970e4d4dd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.725184] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d79c14ab-705d-450a-933a-1226b7ff4c6c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1608.728099] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for the task: (returnval){
[ 1608.728099] env[60788]: value = "task-2205264"
[ 1608.728099] env[60788]: _type = "Task"
[ 1608.728099] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1608.735558] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': task-2205264, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1608.747043] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1608.799938] env[60788]: DEBUG oslo_vmware.rw_handles [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1608.859320] env[60788]: DEBUG oslo_vmware.rw_handles [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1608.859511] env[60788]: DEBUG oslo_vmware.rw_handles [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1609.239265] env[60788]: DEBUG oslo_vmware.api [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Task: {'id': task-2205264, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068724} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1609.239602] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1609.239699] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1609.239875] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1609.240062] env[60788]: INFO nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 1609.242108] env[60788]: DEBUG nova.compute.claims [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1609.242284] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1609.242503] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1609.425502] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76b18135-22b6-4d32-bee9-3c4c7987fccb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1609.432783] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4757c3fc-a441-4da8-859b-235bf7fa1650 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1609.461680] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fefd45a7-8e9f-4854-93ec-57e468d6b36e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1609.468214] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f110060-228f-4cc1-922a-3df6e790d6a7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1609.481663] env[60788]: DEBUG nova.compute.provider_tree [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1609.490333] env[60788]: DEBUG nova.scheduler.client.report [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1609.504695] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.262s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1609.505223] env[60788]: ERROR nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1609.505223] env[60788]: Faults: ['InvalidArgument']
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Traceback (most recent call last):
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self.driver.spawn(context, instance, image_meta,
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self._fetch_image_if_missing(context, vi)
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] image_cache(vi, tmp_image_ds_loc)
[ 1609.505223] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] vm_util.copy_virtual_disk(
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] session._wait_for_task(vmdk_copy_task)
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] return self.wait_for_task(task_ref)
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] return evt.wait()
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] result = hub.switch()
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] return self.greenlet.switch()
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1609.505697] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] self.f(*self.args, **self.kw)
[ 1609.506084] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1609.506084] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] raise exceptions.translate_fault(task_info.error)
[ 1609.506084] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1609.506084] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Faults: ['InvalidArgument']
[ 1609.506084] env[60788]: ERROR nova.compute.manager [instance: 28605b2e-9795-47a0-821c-5cf8da077d37]
[ 1609.506084] env[60788]: DEBUG nova.compute.utils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1609.507187] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Build of instance 28605b2e-9795-47a0-821c-5cf8da077d37 was re-scheduled: A specified parameter was not correct: fileType
[ 1609.507187] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1609.507562] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1609.507739] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1609.507906] env[60788]: DEBUG nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1609.508089] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1609.801848] env[60788]: DEBUG nova.network.neutron [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1609.813583] env[60788]: INFO nova.compute.manager [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Took 0.31 seconds to deallocate network for instance.
[ 1609.917481] env[60788]: INFO nova.scheduler.client.report [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Deleted allocations for instance 28605b2e-9795-47a0-821c-5cf8da077d37
[ 1609.941405] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e96f675c-46b5-43f4-a335-79b49f21cd2f tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "28605b2e-9795-47a0-821c-5cf8da077d37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 632.128s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1609.942580] env[60788]: DEBUG oslo_concurrency.lockutils [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "28605b2e-9795-47a0-821c-5cf8da077d37" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 436.018s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1609.942833] env[60788]: DEBUG oslo_concurrency.lockutils [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Acquiring lock "28605b2e-9795-47a0-821c-5cf8da077d37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1609.943054] env[60788]: DEBUG oslo_concurrency.lockutils [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "28605b2e-9795-47a0-821c-5cf8da077d37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1609.943740] env[60788]: DEBUG oslo_concurrency.lockutils [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "28605b2e-9795-47a0-821c-5cf8da077d37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1609.945435] env[60788]: INFO nova.compute.manager [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Terminating instance
[ 1609.947756] env[60788]: DEBUG nova.compute.manager [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1609.947756] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1609.948099] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9b780a43-fa6c-4923-a108-4404860582dc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1609.957955] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47640a58-7420-4fe4-a270-6cd05e0156d3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1609.969394] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1609.992763] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 28605b2e-9795-47a0-821c-5cf8da077d37 could not be found.
[ 1609.992763] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1609.992763] env[60788]: INFO nova.compute.manager [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 1609.993098] env[60788]: DEBUG oslo.service.loopingcall [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1609.993098] env[60788]: DEBUG nova.compute.manager [-] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1609.993200] env[60788]: DEBUG nova.network.neutron [-] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1610.020346] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1610.020712] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1610.022094] env[60788]: INFO nova.compute.claims [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1610.031734] env[60788]: DEBUG nova.network.neutron [-] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1610.042551] env[60788]: INFO nova.compute.manager [-] [instance: 28605b2e-9795-47a0-821c-5cf8da077d37] Took 0.05 seconds to deallocate network for instance.
[ 1610.159973] env[60788]: DEBUG oslo_concurrency.lockutils [None req-46e2a5d4-551a-462d-babb-adc6c1426928 tempest-MultipleCreateTestJSON-2016497294 tempest-MultipleCreateTestJSON-2016497294-project-member] Lock "28605b2e-9795-47a0-821c-5cf8da077d37" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.217s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1610.221922] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f282cff7-36f9-4ee0-8a29-882ff6fc19d4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1610.229781] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66d6c572-7562-4e88-9b49-9b559f64dea1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1610.260221] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34fa90c3-78ac-40d2-8514-0fd0d13531e9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1610.267196] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f82022c-b4a8-430f-a1ab-bcb674fb2e52 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1610.280087] env[60788]: DEBUG nova.compute.provider_tree [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1610.288207] env[60788]: DEBUG nova.scheduler.client.report [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1610.304931] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1610.305413] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1610.338438] env[60788]: DEBUG nova.compute.utils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1610.339922] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1610.340111] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1610.348106] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1610.395317] env[60788]: DEBUG nova.policy [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '06700e2ecbed438fbdedf19260912e02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '095118217f1e469f82fddf907db84df0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1610.408525] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
[ 1610.434267] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1610.434521] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1610.434685] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1610.434860] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1610.435008] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1610.435163] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1610.435370] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1610.435533] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1610.435783] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1610.435853] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1610.436040] env[60788]: DEBUG nova.virt.hardware [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1610.436887] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5449d4c2-fe0c-4975-b946-9eab26beecf2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1610.445240] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e42d481-d5bd-4122-a2e5-d4be2237cdbf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1610.748304] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Successfully created port: 9c9d5773-2725-4089-a348-d4a1f2d6f0e7 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1611.732840] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Successfully updated port: 9c9d5773-2725-4089-a348-d4a1f2d6f0e7 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1611.756566] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "refresh_cache-ded19ccc-a92f-4d3e-8659-593a1aab1651" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1611.756727] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquired lock "refresh_cache-ded19ccc-a92f-4d3e-8659-593a1aab1651" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1611.757100] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1611.795118] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1611.841331] env[60788]: DEBUG nova.compute.manager [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Received event network-vif-plugged-9c9d5773-2725-4089-a348-d4a1f2d6f0e7 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1611.841635] env[60788]: DEBUG oslo_concurrency.lockutils [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] Acquiring lock "ded19ccc-a92f-4d3e-8659-593a1aab1651-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1611.841842] env[60788]: DEBUG oslo_concurrency.lockutils [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1611.842019] env[60788]: DEBUG oslo_concurrency.lockutils [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1611.842193] env[60788]: DEBUG nova.compute.manager [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] No waiting events found dispatching network-vif-plugged-9c9d5773-2725-4089-a348-d4a1f2d6f0e7 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1611.842369] env[60788]: WARNING nova.compute.manager [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Received unexpected event network-vif-plugged-9c9d5773-2725-4089-a348-d4a1f2d6f0e7 for instance with vm_state building and task_state spawning.
[ 1611.842638] env[60788]: DEBUG nova.compute.manager [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Received event network-changed-9c9d5773-2725-4089-a348-d4a1f2d6f0e7 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1611.842690] env[60788]: DEBUG nova.compute.manager [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Refreshing instance network info cache due to event network-changed-9c9d5773-2725-4089-a348-d4a1f2d6f0e7. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1611.842841] env[60788]: DEBUG oslo_concurrency.lockutils [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] Acquiring lock "refresh_cache-ded19ccc-a92f-4d3e-8659-593a1aab1651" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1611.964494] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Updating instance_info_cache with network_info: [{"id": "9c9d5773-2725-4089-a348-d4a1f2d6f0e7", "address": "fa:16:3e:24:a8:cb", "network": {"id": "78ae2181-5ef4-4a5f-8b55-2cf816a8de6a", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1283543543-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "095118217f1e469f82fddf907db84df0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e7a0d5af-5be9-477a-837c-58ef55c717f4", "external-id": "nsx-vlan-transportzone-598", "segmentation_id": 598, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c9d5773-27", "ovs_interfaceid": "9c9d5773-2725-4089-a348-d4a1f2d6f0e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1611.977200] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Releasing lock "refresh_cache-ded19ccc-a92f-4d3e-8659-593a1aab1651" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1611.977492] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance network_info: |[{"id": "9c9d5773-2725-4089-a348-d4a1f2d6f0e7", "address": "fa:16:3e:24:a8:cb", "network": {"id": "78ae2181-5ef4-4a5f-8b55-2cf816a8de6a", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1283543543-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "095118217f1e469f82fddf907db84df0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e7a0d5af-5be9-477a-837c-58ef55c717f4", "external-id": "nsx-vlan-transportzone-598", "segmentation_id": 598, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c9d5773-27", "ovs_interfaceid": "9c9d5773-2725-4089-a348-d4a1f2d6f0e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 1611.977791] env[60788]: DEBUG oslo_concurrency.lockutils [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] Acquired lock "refresh_cache-ded19ccc-a92f-4d3e-8659-593a1aab1651" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1611.977974] env[60788]: DEBUG nova.network.neutron [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Refreshing network info cache for port 9c9d5773-2725-4089-a348-d4a1f2d6f0e7 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1611.979190] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:a8:cb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e7a0d5af-5be9-477a-837c-58ef55c717f4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9c9d5773-2725-4089-a348-d4a1f2d6f0e7', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1611.986812] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Creating folder: Project (095118217f1e469f82fddf907db84df0). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1611.990116] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6377132-5d31-4567-8913-7df0e31ced6b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1612.001698] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Created folder: Project (095118217f1e469f82fddf907db84df0) in parent group-v449747.
[ 1612.001885] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Creating folder: Instances. Parent ref: group-v449840. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1612.002142] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9fd86a1c-17d1-4cba-a210-3ef2e76e6f01 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1612.011697] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Created folder: Instances in parent group-v449840.
[ 1612.011921] env[60788]: DEBUG oslo.service.loopingcall [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1612.012121] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1612.012340] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f21b075c-1812-4fc4-8e61-f313005568f6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.032522] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1612.032522] env[60788]: value = "task-2205267" [ 1612.032522] env[60788]: _type = "Task" [ 1612.032522] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1612.039884] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205267, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1612.315557] env[60788]: DEBUG nova.network.neutron [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Updated VIF entry in instance network info cache for port 9c9d5773-2725-4089-a348-d4a1f2d6f0e7. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1612.316013] env[60788]: DEBUG nova.network.neutron [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Updating instance_info_cache with network_info: [{"id": "9c9d5773-2725-4089-a348-d4a1f2d6f0e7", "address": "fa:16:3e:24:a8:cb", "network": {"id": "78ae2181-5ef4-4a5f-8b55-2cf816a8de6a", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1283543543-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "095118217f1e469f82fddf907db84df0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e7a0d5af-5be9-477a-837c-58ef55c717f4", "external-id": "nsx-vlan-transportzone-598", "segmentation_id": 598, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c9d5773-27", "ovs_interfaceid": "9c9d5773-2725-4089-a348-d4a1f2d6f0e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1612.326851] env[60788]: DEBUG oslo_concurrency.lockutils [req-0c409ac3-dac8-40d8-ac50-14a6026f319f req-aafa0550-9bde-4d63-9289-73cb6c45cf35 service nova] Releasing lock "refresh_cache-ded19ccc-a92f-4d3e-8659-593a1aab1651" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1612.542498] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205267, 'name': CreateVM_Task} progress is 99%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1613.044130] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205267, 'name': CreateVM_Task, 'duration_secs': 0.532228} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1613.044481] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1613.044961] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1613.045148] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1613.045469] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1613.045711] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b6f5d3c-a873-499a-b785-43a1bfda4b76 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.049878] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Waiting for the task: (returnval){ [ 1613.049878] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52a46c75-2ea6-97b1-8296-dc307f7d89f2" [ 1613.049878] env[60788]: _type = "Task" [ 1613.049878] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1613.057131] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52a46c75-2ea6-97b1-8296-dc307f7d89f2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1613.560778] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1613.561013] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1613.561251] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1622.753589] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1623.754067] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1625.754053] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1625.754385] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1626.755108] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1626.755373] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1627.750128] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1628.754158] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1628.754518] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1628.754518] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1628.776457] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.776576] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.776727] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.776846] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.776957] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.777092] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.777217] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.777338] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.777456] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.777573] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1628.777694] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1628.778264] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1628.789650] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1628.789862] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1628.790060] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1628.790236] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1628.791312] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24953ff6-4ab5-4442-bbb6-3734dbfa742f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.801650] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27db3fd4-e334-44d8-8ef3-b719c1fa09ff {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.815903] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c81a9815-8ead-417b-a2c9-754c41b56952 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.822438] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ca8253-fa9b-41eb-a5ce-df611900c072 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1628.851919] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181257MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1628.852074] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1628.852275] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1628.923279] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f7fa5c24-7ff5-4656-897f-b0164c989207 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.923445] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.923576] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.923700] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.923820] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.924018] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.924163] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.924286] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.924406] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.924525] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1628.935065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1628.935365] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1628.935521] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1629.057243] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818b404b-4bb6-4897-a6c0-7e2f69aebfe1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.064848] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d42596f9-55cd-4085-8747-81d2246759c2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.096433] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01c3887-36a1-4c63-8ec1-5b38c72a1381 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.103230] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32a58168-91ac-4aac-b4bd-c853b79af6da {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.116121] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1629.124566] env[60788]: DEBUG nova.scheduler.client.report [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1629.138120] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1629.138301] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.286s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1630.113550] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1658.101482] env[60788]: WARNING oslo_vmware.rw_handles [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1658.101482] env[60788]: ERROR oslo_vmware.rw_handles [ 1658.102217] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1658.103897] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1658.104173] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Copying Virtual Disk [datastore2] vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/1af0c076-540b-4315-9909-f3296adf38ca/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1658.104474] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4dbe195a-dfa8-479a-b224-c8e972ec1a11 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.112366] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Waiting for the task: (returnval){ [ 1658.112366] env[60788]: value = "task-2205268" [ 1658.112366] env[60788]: _type = "Task" [ 1658.112366] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1658.119921] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Task: {'id': task-2205268, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1658.622657] env[60788]: DEBUG oslo_vmware.exceptions [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1658.622956] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1658.623542] env[60788]: ERROR nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1658.623542] env[60788]: Faults: ['InvalidArgument'] [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Traceback (most recent call last): [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] yield resources [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] self.driver.spawn(context, instance, image_meta, [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] self._fetch_image_if_missing(context, vi) [ 1658.623542] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] image_cache(vi, tmp_image_ds_loc) [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] vm_util.copy_virtual_disk( [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] session._wait_for_task(vmdk_copy_task) [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] return self.wait_for_task(task_ref) [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] return evt.wait() [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] result = hub.switch() [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1658.623937] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] return self.greenlet.switch() [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] self.f(*self.args, **self.kw) [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] raise exceptions.translate_fault(task_info.error) [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Faults: ['InvalidArgument'] [ 1658.624354] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] [ 1658.624354] env[60788]: INFO nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Terminating instance [ 1658.625456] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1658.625668] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1658.625910] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbae2c10-59e2-4bd1-ba23-58e86ba29069 {{(pid=60788) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.628138] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1658.628330] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1658.629082] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83939010-9bdf-404c-8e76-6641ac55432f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.636113] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1658.636334] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-000da64d-0555-4324-9065-e5d18610822b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.638582] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1658.638754] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1658.639725] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f6f582c9-8740-42a3-b582-ae4c8eddb786 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.644180] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for the task: (returnval){ [ 1658.644180] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]526ac92f-aa57-afa8-ce30-b57610f3db02" [ 1658.644180] env[60788]: _type = "Task" [ 1658.644180] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1658.659411] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1658.659646] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Creating directory with path [datastore2] vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1658.659861] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-07d5e512-43b6-4a15-b8f2-29bb860b8945 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.681130] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Created directory with path [datastore2] vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1658.681324] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Fetch image to [datastore2] vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1658.681495] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1658.682281] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59d0316a-5a9e-4e19-a6a2-9235483c4447 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.689231] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77fa71ce-9e4b-48f2-9589-05f423debe69 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.699751] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7ccbfeb-a87c-4933-bd7c-932c215fa5a9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.734608] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59c79e25-d02e-4f3a-a375-0bef389d8f91 {{(pid=60788) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.737351] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1658.737566] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1658.737754] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Deleting the datastore file [datastore2] f7fa5c24-7ff5-4656-897f-b0164c989207 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1658.737997] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e020be8c-c5df-4bd0-875a-35ef54099361 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.743481] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fc1c6b96-9be2-42c7-ad9e-a73ab9449324 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.745240] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Waiting for the task: (returnval){ [ 1658.745240] env[60788]: value = "task-2205270" [ 1658.745240] env[60788]: _type = "Task" [ 1658.745240] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1658.752795] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Task: {'id': task-2205270, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1658.764945] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1658.813534] env[60788]: DEBUG oslo_vmware.rw_handles [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1658.871998] env[60788]: DEBUG oslo_vmware.rw_handles [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1658.872245] env[60788]: DEBUG oslo_vmware.rw_handles [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1659.255348] env[60788]: DEBUG oslo_vmware.api [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Task: {'id': task-2205270, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070078} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1659.255727] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1659.255822] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1659.255935] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1659.256128] env[60788]: INFO nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Took 0.63 seconds to destroy the instance on the hypervisor. 
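Note: the spawn failure above originates in the CopyVirtualDisk_Task invocation: the session's task poller (the "progress is 0%" entries) sees the task enter the error state and raises VimFaultException carrying the fault name InvalidArgument and the message "A specified parameter was not correct: fileType", which then propagates up through nova's _fetch_image_if_missing into _build_and_run_instance. A hedged sketch of what catching that looks like on the client side; session is as in the previous sketch and copy_task stands for the managed-object reference returned by the CopyVirtualDisk_Task call, both placeholders rather than values from this log.

    from oslo_vmware import exceptions as vexc

    try:
        # wait_for_task() polls the task object until it completes and
        # raises once task_info.state goes to 'error', translating the
        # vSphere fault into a typed oslo.vmware exception.
        session.wait_for_task(copy_task)
    except vexc.VimFaultException as exc:
        # exc.fault_list holds the vSphere fault names, here
        # ['InvalidArgument']; str(exc) yields the localized message,
        # "A specified parameter was not correct: fileType" above.
        if 'InvalidArgument' in exc.fault_list:
            # In the log, nova reacts by destroying the half-built VM,
            # aborting the resource claim, and re-scheduling the build
            # (see the "was re-scheduled" entry further down); a
            # standalone client would clean up and retry or re-raise.
            raise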
[ 1659.258294] env[60788]: DEBUG nova.compute.claims [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1659.258462] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1659.258674] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1659.426563] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c402ed91-cf87-4a07-a78a-9f30710a3045 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.434015] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eceea39e-9b71-494d-afd9-2312608b2f09 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.464536] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b91e721b-7f30-4766-9efa-e4d6daa774de {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.471224] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bae41a10-a8aa-4154-a529-416e74cde9f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.483857] env[60788]: DEBUG nova.compute.provider_tree [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1659.491927] env[60788]: DEBUG nova.scheduler.client.report [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1659.507610] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 
tempest-ServerAddressesTestJSON-68787833-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.249s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1659.508151] env[60788]: ERROR nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1659.508151] env[60788]: Faults: ['InvalidArgument']
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Traceback (most recent call last):
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     self.driver.spawn(context, instance, image_meta,
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     self._fetch_image_if_missing(context, vi)
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     image_cache(vi, tmp_image_ds_loc)
[ 1659.508151] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     vm_util.copy_virtual_disk(
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     session._wait_for_task(vmdk_copy_task)
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     return self.wait_for_task(task_ref)
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     return evt.wait()
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     result = hub.switch()
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     return self.greenlet.switch()
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1659.508533] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     self.f(*self.args, **self.kw)
[ 1659.508913] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1659.508913] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]     raise exceptions.translate_fault(task_info.error)
[ 1659.508913] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1659.508913] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Faults: ['InvalidArgument']
[ 1659.508913] env[60788]: ERROR nova.compute.manager [instance: f7fa5c24-7ff5-4656-897f-b0164c989207]
[ 1659.508913] env[60788]: DEBUG nova.compute.utils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1659.510231] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Build of instance f7fa5c24-7ff5-4656-897f-b0164c989207 was re-scheduled: A specified parameter was not correct: fileType
[ 1659.510231] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1659.510590] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1659.510761] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
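The traceback above shows the sparse-image cache path: _cache_sparse_image() calls vm_util.copy_virtual_disk(), which starts a vCenter CopyVirtualDisk_Task and then blocks in wait_for_task() until the poller translates the server-side task error into a VimFaultException. A minimal sketch of that copy-and-wait pattern, assuming an already-established oslo.vmware session object `session` and a datacenter moref `dc_ref` (the function name and arguments here are illustrative, not Nova's exact helper):

    from oslo_vmware import exceptions as vexc

    def copy_virtual_disk(session, dc_ref, source_path, dest_path):
        """Start a CopyVirtualDisk_Task and block until it finishes."""
        vdm = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', vdm,
                                  sourceName=source_path,
                                  sourceDatacenter=dc_ref,
                                  destName=dest_path)
        try:
            # wait_for_task() polls task.info server-side; on failure it
            # raises the VimFaultException translated from task_info.error,
            # which is the InvalidArgument/fileType fault seen above.
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # e.fault_list carries the fault names, e.g. ['InvalidArgument']
            raise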
[ 1659.510929] env[60788]: DEBUG nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1659.511105] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1660.186977] env[60788]: DEBUG nova.network.neutron [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1660.200267] env[60788]: INFO nova.compute.manager [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Took 0.69 seconds to deallocate network for instance.
[ 1660.298373] env[60788]: INFO nova.scheduler.client.report [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Deleted allocations for instance f7fa5c24-7ff5-4656-897f-b0164c989207
[ 1660.319631] env[60788]: DEBUG oslo_concurrency.lockutils [None req-73b75028-abf3-4492-9086-8ffd8878d56e tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 619.025s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1660.320793] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 422.655s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1660.321036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Acquiring lock "f7fa5c24-7ff5-4656-897f-b0164c989207-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1660.321254] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1660.321426] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1660.323579] env[60788]: INFO nova.compute.manager [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Terminating instance
[ 1660.325355] env[60788]: DEBUG nova.compute.manager [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1660.325553] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1660.326246] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-daba16bf-b917-4cea-9ed3-d62cf8a5b7b0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.336634] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c345ae53-bd19-4bda-bb1d-6a1f9ae80963 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.347537] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}}
[ 1660.367425] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f7fa5c24-7ff5-4656-897f-b0164c989207 could not be found.
[ 1660.367600] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1660.367763] env[60788]: INFO nova.compute.manager [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Took 0.04 seconds to destroy the instance on the hypervisor.
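The "acquired by ... waited Ns" and ""released" by ... held Ns" lines throughout this section come from the inner() wrapper that oslo.concurrency generates around locked functions (lockutils.py:404/409/423 above). A minimal sketch of the decorator that produces them; the lock name matches the per-instance lock in the log, the function name is illustrative:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('f7fa5c24-7ff5-4656-897f-b0164c989207')
    def do_terminate_instance():
        # Body runs with the per-instance lock held. A second caller
        # blocks on entry, and the time it spent blocked is what shows
        # up as "waited 422.655s" in the log once it finally acquires.
        pass

This is why the terminate request above only proceeds the moment the 619-second build lock is released.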
[ 1660.368234] env[60788]: DEBUG oslo.service.loopingcall [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1660.368344] env[60788]: DEBUG nova.compute.manager [-] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1660.368410] env[60788]: DEBUG nova.network.neutron [-] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1660.391507] env[60788]: DEBUG nova.network.neutron [-] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1660.394530] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1660.394786] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1660.396215] env[60788]: INFO nova.compute.claims [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1660.400839] env[60788]: INFO nova.compute.manager [-] [instance: f7fa5c24-7ff5-4656-897f-b0164c989207] Took 0.03 seconds to deallocate network for instance.
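The "Waiting for function ... _deallocate_network_with_retries to return" line is the oslo.service looping-call machinery blocking the caller while a retry wrapper runs. A simplified, self-contained sketch of the general pattern (not Nova's exact retry policy, which lives in _try_deallocate_network):

    from oslo_service import loopingcall

    def deallocate_network():
        # stand-in for the real Neutron deallocation call
        pass

    def _deallocate_network_with_retries():
        try:
            deallocate_network()
        except Exception:
            return  # swallow the failure; run again on the next tick
        raise loopingcall.LoopingCallDone()  # success: stop looping

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_network_with_retries)
    timer.start(interval=1.0).wait()  # blocks until LoopingCallDone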
[ 1660.491012] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b4e3f5ba-4eb0-4320-9e32-0030a205e9fc tempest-ServerAddressesTestJSON-68787833 tempest-ServerAddressesTestJSON-68787833-project-member] Lock "f7fa5c24-7ff5-4656-897f-b0164c989207" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.170s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1660.580294] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8af57ac0-95a4-462f-940e-e0ca0f28b586 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.588300] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40bd2c79-cd9a-4c7d-81e9-1b5251ca653d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.617588] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09eb2c73-d4c3-4bfa-9802-8158e4a47f67 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.624139] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-015ac4bc-be18-411a-b3d4-f5c35320b147 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.636853] env[60788]: DEBUG nova.compute.provider_tree [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1660.645266] env[60788]: DEBUG nova.scheduler.client.report [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1660.660878] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1660.661236] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}}
[ 1660.694953] env[60788]: DEBUG nova.compute.utils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1660.696548] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}}
[ 1660.696742] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1660.704631] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}}
[ 1660.752871] env[60788]: DEBUG nova.policy [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89b673319ed34de9859c0f58f1c616c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d4606e74dad40acba2d78ea01a69919', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1660.768016] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}}
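The "Policy check for network:attach_external_network failed" line records an oslo.policy authorization: the member/reader token does not satisfy the rule guarding external networks, so Nova simply filters them out instead of erroring. A hedged sketch of such a check; the 'role:admin' rule string is an assumption for illustration, not necessarily Nova's actual default:

    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    # Register an illustrative admin-only default for the rule seen above.
    enforcer.register_default(
        policy.RuleDefault('network:attach_external_network', 'role:admin'))

    creds = {'roles': ['member', 'reader'],
             'project_id': '8d4606e74dad40acba2d78ea01a69919'}
    allowed = enforcer.authorize('network:attach_external_network',
                                 {}, creds, do_raise=False)
    print(allowed)  # False -> the external network is skipped, as logged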
[ 1660.797082] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1660.797347] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1660.797505] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1660.797688] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1660.797840] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1660.797990] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1660.798220] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1660.798383] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1660.798550] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1660.798713] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1660.798883] env[60788]: DEBUG nova.virt.hardware [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1660.799813] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d61196cc-7876-4707-9fd3-670b9db19daa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1660.807797] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e000d5e-6ea1-4c86-812f-c6381af3eb1c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1661.568196] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Successfully created port: a19ddb5b-2cc1-4179-a77d-9b44d7509b68 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1662.227064] env[60788]: DEBUG nova.compute.manager [req-0b3903ec-cedd-4f10-a6c4-fe969d47b86d req-9fe3cdea-ea76-47cb-9027-8545444abf42 service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Received event network-vif-plugged-a19ddb5b-2cc1-4179-a77d-9b44d7509b68 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1662.227244] env[60788]: DEBUG oslo_concurrency.lockutils [req-0b3903ec-cedd-4f10-a6c4-fe969d47b86d req-9fe3cdea-ea76-47cb-9027-8545444abf42 service nova] Acquiring lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1662.227461] env[60788]: DEBUG oslo_concurrency.lockutils [req-0b3903ec-cedd-4f10-a6c4-fe969d47b86d req-9fe3cdea-ea76-47cb-9027-8545444abf42 service nova] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1662.227608] env[60788]: DEBUG oslo_concurrency.lockutils [req-0b3903ec-cedd-4f10-a6c4-fe969d47b86d req-9fe3cdea-ea76-47cb-9027-8545444abf42 service nova] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
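The topology lines above enumerate CPU layouts whose sockets x cores x threads product equals the flavor's vcpus, bounded by the 65536 maxima. A simplified re-implementation for illustration (Nova's real version in nova/virt/hardware.py applies additional constraints such as preferences and result ordering):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) tuples that exactly fit vcpus."""
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        yield (s, c, t)

    print(list(possible_topologies(1)))
    # [(1, 1, 1)] -- matching "Got 1 possible topologies" for the 1-vcpu m1.nano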
[ 1662.227774] env[60788]: DEBUG nova.compute.manager [req-0b3903ec-cedd-4f10-a6c4-fe969d47b86d req-9fe3cdea-ea76-47cb-9027-8545444abf42 service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] No waiting events found dispatching network-vif-plugged-a19ddb5b-2cc1-4179-a77d-9b44d7509b68 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1662.227934] env[60788]: WARNING nova.compute.manager [req-0b3903ec-cedd-4f10-a6c4-fe969d47b86d req-9fe3cdea-ea76-47cb-9027-8545444abf42 service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Received unexpected event network-vif-plugged-a19ddb5b-2cc1-4179-a77d-9b44d7509b68 for instance with vm_state building and task_state spawning.
[ 1662.259756] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Successfully updated port: a19ddb5b-2cc1-4179-a77d-9b44d7509b68 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1662.273587] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "refresh_cache-7cc29f7d-e708-44a9-8ab6-5204163e9c96" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1662.273587] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "refresh_cache-7cc29f7d-e708-44a9-8ab6-5204163e9c96" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1662.273587] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1662.341562] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1662.677750] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Updating instance_info_cache with network_info: [{"id": "a19ddb5b-2cc1-4179-a77d-9b44d7509b68", "address": "fa:16:3e:15:91:15", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa19ddb5b-2c", "ovs_interfaceid": "a19ddb5b-2cc1-4179-a77d-9b44d7509b68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1662.689380] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "refresh_cache-7cc29f7d-e708-44a9-8ab6-5204163e9c96" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1662.689684] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance network_info: |[{"id": "a19ddb5b-2cc1-4179-a77d-9b44d7509b68", "address": "fa:16:3e:15:91:15", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa19ddb5b-2c", "ovs_interfaceid": "a19ddb5b-2cc1-4179-a77d-9b44d7509b68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}}
[ 1662.690081] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:15:91:15', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a19ddb5b-2cc1-4179-a77d-9b44d7509b68', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1662.697870] env[60788]: DEBUG oslo.service.loopingcall [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1662.698341] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1662.698574] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3b3e0c5e-01a5-4d0e-8c2b-b8f52d4da8e9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1662.718316] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1662.718316] env[60788]: value = "task-2205271"
[ 1662.718316] env[60788]: _type = "Task"
[ 1662.718316] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1662.725811] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205271, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1663.229169] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205271, 'name': CreateVM_Task, 'duration_secs': 0.301479} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
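The (returnval){ value = "task-2205271", _type = "Task" } block above is a VMware ManagedObjectReference for the asynchronous CreateVM_Task; the progress lines come from repeatedly reading its info property. A sketch of one poll step, assuming an established oslo.vmware `session` (wait_for_task wraps this loop internally):

    from oslo_vmware import vim_util

    def poll_task_once(session, task_ref):
        """Read the task's current state from the property collector."""
        info = session.invoke_api(vim_util, 'get_object_property',
                                  session.vim, task_ref, 'info')
        # info.state is one of: queued, running, success, error.
        # On 'error', the fault is translated and raised, as in the
        # earlier VimFaultException traceback.
        return info.state, getattr(info, 'progress', None)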
[ 1663.229371] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1663.230043] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1663.230226] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1663.230568] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1663.230825] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2fbd6e2d-bee2-4834-8a11-7ec821145837 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1663.235230] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){
[ 1663.235230] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52307ebe-da38-094d-1f05-06c627a3be85"
[ 1663.235230] env[60788]: _type = "Task"
[ 1663.235230] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1663.249424] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52307ebe-da38-094d-1f05-06c627a3be85, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1663.746699] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1663.746988] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1663.747223] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1664.253353] env[60788]: DEBUG nova.compute.manager [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Received event network-changed-a19ddb5b-2cc1-4179-a77d-9b44d7509b68 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}}
[ 1664.253570] env[60788]: DEBUG nova.compute.manager [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Refreshing instance network info cache due to event network-changed-a19ddb5b-2cc1-4179-a77d-9b44d7509b68. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}}
[ 1664.253762] env[60788]: DEBUG oslo_concurrency.lockutils [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] Acquiring lock "refresh_cache-7cc29f7d-e708-44a9-8ab6-5204163e9c96" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1664.253906] env[60788]: DEBUG oslo_concurrency.lockutils [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] Acquired lock "refresh_cache-7cc29f7d-e708-44a9-8ab6-5204163e9c96" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1664.254274] env[60788]: DEBUG nova.network.neutron [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Refreshing network info cache for port a19ddb5b-2cc1-4179-a77d-9b44d7509b68 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1664.485790] env[60788]: DEBUG nova.network.neutron [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Updated VIF entry in instance network info cache for port a19ddb5b-2cc1-4179-a77d-9b44d7509b68. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
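The "[datastore2] devstack-image-cache_base/<image-uuid>" lock above serializes access to the per-image cache directory, so concurrent spawns of the same Glance image do not fetch or convert it twice. A minimal sketch of that pattern with oslo.concurrency (the fetch callable is illustrative):

    from oslo_concurrency import lockutils

    IMAGE_LOCK = ('[datastore2] devstack-image-cache_base/'
                  '1d9d6f6c-1335-48c8-9690-b6c8e781cb21')

    def fetch_image_if_missing(fetch):
        # Only one greenthread per process enters this block for a given
        # image; others wait, then find the cached copy already present.
        with lockutils.lock(IMAGE_LOCK):
            fetch()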
[ 1664.486199] env[60788]: DEBUG nova.network.neutron [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Updating instance_info_cache with network_info: [{"id": "a19ddb5b-2cc1-4179-a77d-9b44d7509b68", "address": "fa:16:3e:15:91:15", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa19ddb5b-2c", "ovs_interfaceid": "a19ddb5b-2cc1-4179-a77d-9b44d7509b68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1664.496106] env[60788]: DEBUG oslo_concurrency.lockutils [req-f842bb48-cb81-4325-90bf-88e1224d4186 req-41151446-2d5c-4025-92ee-ab746c7b984a service nova] Releasing lock "refresh_cache-7cc29f7d-e708-44a9-8ab6-5204163e9c96" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1668.144365] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "4864273c-b505-4e31-bf7b-633ba1e99562" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1668.144679] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "4864273c-b505-4e31-bf7b-633ba1e99562" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1679.753648] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1680.756671] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1682.708434] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1682.729183] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Getting list of instances from cluster (obj){
[ 1682.729183] env[60788]: value = "domain-c8"
[ 1682.729183] env[60788]: _type = "ClusterComputeResource"
[ 1682.729183] env[60788]: } {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1682.730539] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55ec0d1c-5ae4-4eae-951d-bde6ccd1a9e9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1682.748649] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Got total of 10 instances {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 1682.748838] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid d63f9834-818b-4087-851c-d7394d20b89d {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749049] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 688ff077-9505-48f5-9117-0a7f115f254c {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749221] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 58bbe972-5fc1-4627-90e4-91251e047e86 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749379] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid fb532f8b-5323-4f7a-be64-c6076a1862ae {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749534] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid f08a350c-54b6-44ce-bb3f-b9ab5deacf9d {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749685] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 67c365fa-74b8-4a57-abbc-c143990a0292 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749836] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid e3671c90-83c7-48f3-8b2a-97f34ab2505e {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.749988] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid e34c6299-ae90-4e5a-b272-3623dfe876c0 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.750159] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid ded19ccc-a92f-4d3e-8659-593a1aab1651 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.750311] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 7cc29f7d-e708-44a9-8ab6-5204163e9c96 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 1682.750637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "d63f9834-818b-4087-851c-d7394d20b89d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.750874] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "688ff077-9505-48f5-9117-0a7f115f254c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.751091] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "58bbe972-5fc1-4627-90e4-91251e047e86" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.751300] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.751494] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.751701] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "67c365fa-74b8-4a57-abbc-c143990a0292" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.751898] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.752116] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1682.752321] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
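The "Running periodic task ComputeManager.*" lines are emitted by oslo.service's periodic-task runner as it walks the methods registered on the manager class. A minimal, self-contained sketch of that registration mechanism (the 600-second spacing is an assumption for illustration, not Nova's configured value):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        @periodic_task.periodic_task(spacing=600)
        def _sync_power_states(self, context):
            # compare driver power state with the DB record per instance,
            # taking a per-uuid lock as in the "Acquiring lock" lines above
            pass

    mgr = Manager()
    mgr.run_periodic_tasks(context=None)  # logs "Running periodic task ..."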
[ 1682.752515] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1683.798428] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1683.798758] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1686.754391] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1686.754666] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}}
[ 1687.754802] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1688.754555] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1688.754773] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1688.766712] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1688.767052] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1688.767106] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1688.767269] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1688.768402] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-493fceac-9f60-4538-814b-80a7618508c7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1688.777017] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a7e491c-301b-45ee-aa88-8c3a6aa8054d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1688.792708] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ade331c-f832-4000-b37a-68564b23e1fe {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1688.799141] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d83323d6-6e6a-4edd-b042-16048a416e88 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1688.829705] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181221MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1688.829864] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1688.830075] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1688.974420] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance d63f9834-818b-4087-851c-d7394d20b89d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1688.974598] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 1688.974730] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
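The inventory this audit reports to placement follows the standard capacity rule, capacity = (total - reserved) * allocation_ratio, which is why 48 physical vcpus at a 4.0 ratio can comfortably back the 10 allocated vcpus. A quick check of the figures that appear in this log:

    inv = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, i in inv.items():
        print(rc, int((i['total'] - i['reserved']) * i['allocation_ratio']))
    # VCPU 192, MEMORY_MB 196078, DISK_GB 400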
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.974854] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.974973] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.975106] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.975279] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.975398] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.975516] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.975631] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1688.986988] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1688.987220] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1688.987368] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1689.005739] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing inventories for resource provider 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1689.020523] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating ProviderTree inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1689.020821] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1689.034020] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing aggregate associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, aggregates: None {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1689.052252] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing trait associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1689.195098] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82bed6b0-0487-43ce-ac7e-92b5659e5c17 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1689.201348] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
[ 1689.201348] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d6b8a63-3728-4e95-8435-817c27b8e382 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1689.232358] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ee386cd-dc2f-4b6a-9da8-1b39f729d683 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1689.239712] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0beb9966-5256-463a-af13-793e1ddc283d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1689.253404] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1689.262089] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1689.280753] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1689.280984] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.451s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
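The inventory payload logged above is everything placement needs to compute schedulable capacity. A minimal sketch in plain Python (the dict literal is copied from the log; the helper name schedulable_units is illustrative, not a Nova or placement API):

    # Placement derives capacity per resource class as
    # (total - reserved) * allocation_ratio.
    INVENTORY = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    def schedulable_units(inv: dict) -> float:
        return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

    for rc, inv in INVENTORY.items():
        print(rc, schedulable_units(inv))
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

With the 4.0 VCPU allocation ratio this node advertises 192 schedulable vCPUs, which is why the tracker can report 10 allocated vCPUs against 48 physical ones with no pressure.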
[ 1690.275610] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1690.275905] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1690.275988] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}}
[ 1690.276128] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}}
[ 1690.296071] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296071] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296288] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296288] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296394] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296543] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296634] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296750] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296866] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.296982] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 1690.297120] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}}
[ 1691.753531] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1692.754202] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1692.754708] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances with incomplete migration {{(pid=60788) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}}
[ 1694.762900] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1694.763244] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}}
[ 1694.771818] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] There are 0 instances to clean {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}}
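All of the ComputeManager._poll_* and cleanup entries above are driven by the same oslo.service decorator. A minimal sketch of that pattern, assuming oslo.service and oslo.config are installed (the task body and spacing are invented for illustration, not Nova's values):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        # run_immediately=True so the single call below fires the task once;
        # a real service drives run_periodic_tasks from a looping call.
        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _run_pending_deletes(self, context):
            print('Cleaning up deleted instances')
            print('There are 0 instances to clean')

    Manager().run_periodic_tasks(context=None)

Tasks that should not run at all are skipped inside the task body itself, which is exactly what the "CONF.reclaim_instance_interval <= 0, skipping..." line earlier reflects.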
[ 1707.052625] env[60788]: WARNING oslo_vmware.rw_handles [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1707.052625] env[60788]: ERROR oslo_vmware.rw_handles
[ 1707.053466] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1707.054975] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1707.055237] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Copying Virtual Disk [datastore2] vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/e77360e4-9193-422f-9a17-1fee5637519c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1707.055560] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-421cde0c-765e-42d6-bd0f-7ff1f10e27f8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1707.064063] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for the task: (returnval){
[ 1707.064063] env[60788]:                 value = "task-2205272"
[ 1707.064063] env[60788]:                 _type = "Task"
[ 1707.064063] env[60788]:             } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1707.071719] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Task: {'id': task-2205272, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1707.575015] env[60788]: DEBUG oslo_vmware.exceptions [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1707.575319] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1707.575923] env[60788]: ERROR nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1707.575923] env[60788]: Faults: ['InvalidArgument']
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d] Traceback (most recent call last):
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     yield resources
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self.driver.spawn(context, instance, image_meta,
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self._fetch_image_if_missing(context, vi)
[ 1707.575923] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     image_cache(vi, tmp_image_ds_loc)
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     vm_util.copy_virtual_disk(
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     session._wait_for_task(vmdk_copy_task)
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     return self.wait_for_task(task_ref)
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     return evt.wait()
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     result = hub.switch()
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1707.576358] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     return self.greenlet.switch()
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self.f(*self.args, **self.kw)
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     raise exceptions.translate_fault(task_info.error)
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d] Faults: ['InvalidArgument']
[ 1707.576788] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]
[ 1707.576788] env[60788]: INFO nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Terminating instance
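The traceback shows the whole failure path: _poll_task polls the vCenter task until its state flips to error, then raises the translated fault, which unwinds through wait_for_task into the spawn call. A rough stand-in for that loop (get_task_info and TaskFault are placeholders for the real suds/oslo.vmware plumbing, not its API):

    import time

    class TaskFault(Exception):
        def __init__(self, msg, faults):
            super().__init__(msg)
            self.faults = faults

    def wait_for_task(get_task_info, interval=0.5):
        while True:
            info = get_task_info()          # one PropertyCollector round-trip
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # mirrors raise exceptions.translate_fault(task_info.error)
                raise TaskFault(info['error']['localizedMessage'],
                                info['error']['faults'])
            time.sleep(interval)            # progress is logged on each poll

    states = iter([{'state': 'running'},
                   {'state': 'error',
                    'error': {'localizedMessage':
                              'A specified parameter was not correct: fileType',
                              'faults': ['InvalidArgument']}}])
    try:
        wait_for_task(lambda: next(states), interval=0)
    except TaskFault as exc:
        print(exc, exc.faults)

The "Fault InvalidArgument not matched" line a few entries up is get_fault_class failing to find a specific exception class for InvalidArgument, so the generic VimFaultException is what propagates.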
[ 1707.577952] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1707.578058] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1707.578228] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e905241b-8488-4bd1-95e0-2e4352804a14 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1707.580306] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1707.580500] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquired lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1707.580624] env[60788]: DEBUG nova.network.neutron [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1707.587839] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1707.587839] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1707.588986] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e527eac4-b258-420b-92a1-475ed93a2f1e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1707.596387] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){
[ 1707.596387] env[60788]:                 value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52a2fdc7-57bf-13dd-8e3a-eefaa8a806a3"
[ 1707.596387] env[60788]:                 _type = "Task"
[ 1707.596387] env[60788]:             } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1707.604783] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52a2fdc7-57bf-13dd-8e3a-eefaa8a806a3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1707.627013] env[60788]: DEBUG nova.network.neutron [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1707.758017] env[60788]: DEBUG nova.network.neutron [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1707.767608] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Releasing lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1707.768072] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 1707.768268] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1707.769339] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f38db0d7-f2a1-4afa-a189-9272911e1805 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1707.777956] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1707.777956] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b719982b-3ac0-4489-ab0f-9c5e3466592f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1707.811552] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1707.811777] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1707.811959] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Deleting the datastore file [datastore2] d63f9834-818b-4087-851c-d7394d20b89d {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1707.812238] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bb71f19e-cc18-4192-b3e0-77046ac25bdf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1707.818400] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for the task: (returnval){
[ 1707.818400] env[60788]:                 value = "task-2205274"
[ 1707.818400] env[60788]:                 _type = "Task"
[ 1707.818400] env[60788]:             } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1707.825816] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Task: {'id': task-2205274, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1708.107199] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1708.107542] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating directory with path [datastore2] vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1708.107649] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d4d753b7-12ad-4e99-ada9-8239cd5a627f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.118071] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created directory with path [datastore2] vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1708.118268] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Fetch image to [datastore2] vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1708.118420] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1708.119117] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-414515e7-3ec1-4a7b-9d4c-17b1605cb285 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.125488] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f4d3d48-e2c8-4625-b2c8-e5cbb4221d1b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.134219] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0115f20-5735-4668-b813-7bf700363fd2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.164057] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc1be4f4-26d4-43d5-9d0e-1626f227ceee {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.169839] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-98259716-bef1-4394-b475-56c0d5505bfb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.188327] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1708.236988] env[60788]: DEBUG oslo_vmware.rw_handles [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1708.295788] env[60788]: DEBUG oslo_vmware.rw_handles [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1708.296054] env[60788]: DEBUG oslo_vmware.rw_handles [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
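What rw_handles is doing here is essentially an authenticated HTTP PUT of the image stream to the datastore's /folder endpoint; the generic service ticket acquired a few entries earlier is what authorizes the write. A simplified sketch using requests in place of the raw http.client connection oslo.vmware manages itself (upload_vmdk and the cookie wiring are illustrative assumptions, not the library's API):

    import requests

    def upload_vmdk(url, data, ticket_cookie):
        # data may be bytes or a file-like object; oslo.vmware instead
        # streams the Glance image iterator chunk by chunk over a single
        # connection and reads the response only when the handle is closed,
        # which is where the earlier RemoteDisconnected surfaced.
        resp = requests.put(url,
                            data=data,
                            headers={'Cookie': ticket_cookie,
                                     'Content-Type': 'application/octet-stream'},
                            verify=False)
        resp.raise_for_status()

    # upload_vmdk('https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/'
    #             'vmware_temp/.../tmp-sparse.vmdk?dcPath=ha-datacenter'
    #             '&dsName=datastore2', open('image.vmdk', 'rb'), ticket)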
[ 1708.328231] env[60788]: DEBUG oslo_vmware.api [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Task: {'id': task-2205274, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.029666} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1708.328469] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1708.328653] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1708.328826] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1708.328999] env[60788]: INFO nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Took 0.56 seconds to destroy the instance on the hypervisor.
[ 1708.329252] env[60788]: DEBUG oslo.service.loopingcall [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1708.329467] env[60788]: DEBUG nova.compute.manager [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network deallocation for instance since networking was not requested. {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 1708.331578] env[60788]: DEBUG nova.compute.claims [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1708.331779] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1708.332007] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1708.502142] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56ea8353-fe19-4322-8eaa-2e089cc278f3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.510048] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e99d68c1-4600-4c0d-93a6-6e8dcde10eca {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.539607] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c1634c-e2ff-460a-a8ed-cff61d7d88d5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.546108] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-807b1bdc-3e89-42f7-928d-74ce3ab3be68 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1708.558642] env[60788]: DEBUG nova.compute.provider_tree [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1708.567525] env[60788]: DEBUG nova.scheduler.client.report [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1708.580759] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.249s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1708.581300] env[60788]: ERROR nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1708.581300] env[60788]: Faults: ['InvalidArgument']
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d] Traceback (most recent call last):
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self.driver.spawn(context, instance, image_meta,
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self._fetch_image_if_missing(context, vi)
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     image_cache(vi, tmp_image_ds_loc)
[ 1708.581300] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     vm_util.copy_virtual_disk(
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     session._wait_for_task(vmdk_copy_task)
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     return self.wait_for_task(task_ref)
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     return evt.wait()
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     result = hub.switch()
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     return self.greenlet.switch()
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1708.581677] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     self.f(*self.args, **self.kw)
[ 1708.582023] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1708.582023] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]     raise exceptions.translate_fault(task_info.error)
[ 1708.582023] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1708.582023] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d] Faults: ['InvalidArgument']
[ 1708.582023] env[60788]: ERROR nova.compute.manager [instance: d63f9834-818b-4087-851c-d7394d20b89d]
[ 1708.582023] env[60788]: DEBUG nova.compute.utils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1708.583291] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Build of instance d63f9834-818b-4087-851c-d7394d20b89d was re-scheduled: A specified parameter was not correct: fileType
[ 1708.583291] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1708.583673] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1708.583889] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquiring lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1708.584049] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Acquired lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1708.584209] env[60788]: DEBUG nova.network.neutron [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1708.606679] env[60788]: DEBUG nova.network.neutron [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1708.664971] env[60788]: DEBUG nova.network.neutron [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1708.675291] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Releasing lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1708.675514] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1708.675726] env[60788]: DEBUG nova.compute.manager [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Skipping network deallocation for instance since networking was not requested. {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2260}}
[ 1708.761047] env[60788]: INFO nova.scheduler.client.report [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Deleted allocations for instance d63f9834-818b-4087-851c-d7394d20b89d
[ 1708.787257] env[60788]: DEBUG oslo_concurrency.lockutils [None req-8dea45c7-f72c-4d59-9804-e27b9b8b7546 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "d63f9834-818b-4087-851c-d7394d20b89d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 588.659s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
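The waited/held accounting in these lockutils lines is worth reading together: the failed spawn held the per-instance lock for 588.659s, and the terminate request that follows shows up as 392.634s of waiting on that same lock. A toy version of that instrumentation, timing the wait for acquisition separately from the hold (timed_lock is illustrative, not oslo.concurrency code):

    import contextlib
    import threading
    import time

    @contextlib.contextmanager
    def timed_lock(lock, name, by):
        t0 = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired by "{by}" :: waited {waited:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            held = time.monotonic() - t1
            print(f'Lock "{name}" "released" by "{by}" :: held {held:.3f}s')

    with timed_lock(threading.Lock(),
                    'd63f9834-818b-4087-851c-d7394d20b89d',
                    'do_terminate_instance'):
        pass  # the critical section

Long holds are normal for a spawn, but they serialize every other operation on the instance, which is exactly the pattern the next entries show.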
"refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1708.793983] env[60788]: DEBUG nova.network.neutron [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1708.807731] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1708.822490] env[60788]: DEBUG nova.network.neutron [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1708.863049] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1708.863049] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.863049] env[60788]: INFO nova.compute.claims [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1709.025362] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbbb538a-32ef-404a-ae4d-d5af616472ca {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.033394] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d76ec68-2601-4362-ac01-e3c73657e7df {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.063687] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c4e8b3d-3583-49cd-886c-481a3b13c06f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.071230] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-942d00ac-ff6c-4f08-aa3d-9f18f6975e4a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.085278] 
env[60788]: DEBUG nova.compute.provider_tree [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1709.096452] env[60788]: DEBUG nova.scheduler.client.report [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1709.109478] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.112019] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1709.132737] env[60788]: DEBUG nova.network.neutron [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1709.139650] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Releasing lock "refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1709.140201] env[60788]: DEBUG nova.compute.manager [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1709.141510] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1709.141510] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c2dbc279-14ac-4f7f-835b-1030eecc4167 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.146026] env[60788]: DEBUG nova.compute.utils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1709.148030] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1709.148187] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1709.155377] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cebb4cc6-2579-49b3-a3aa-9c6af511daa9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.166478] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1709.186235] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d63f9834-818b-4087-851c-d7394d20b89d could not be found. [ 1709.186429] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1709.186622] env[60788]: INFO nova.compute.manager [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Took 0.05 seconds to destroy the instance on the hypervisor. 
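[editor's note] The WARNING/DEBUG pair above shows the driver's tolerant delete path: the backing VM is already gone, so the not-found condition is logged and the teardown continues as if the destroy succeeded. A minimal Python sketch of that pattern, with illustrative names (this is not Nova's actual vmops code):

import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy(instance_uuid, lookup_vm, delete_vm):
    # lookup_vm/delete_vm are hypothetical callables; the real driver looks
    # the VM up (e.g. via SearchIndex.FindAllByUuid, as invoked above) and
    # then runs a destroy task against vCenter.
    try:
        delete_vm(lookup_vm(instance_uuid))
    except InstanceNotFound:
        LOG.warning('Instance does not exist on backend: %s', instance_uuid)
    LOG.debug('Instance destroyed')  # reached either way, as in the log
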
[ 1709.186886] env[60788]: DEBUG oslo.service.loopingcall [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1709.187124] env[60788]: DEBUG nova.compute.manager [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1709.187222] env[60788]: DEBUG nova.network.neutron [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1709.212485] env[60788]: DEBUG nova.network.neutron [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1709.220226] env[60788]: DEBUG nova.network.neutron [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1709.228364] env[60788]: DEBUG nova.policy [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34d238f3928b4f40813646c9867375c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a80b1c30e829410c9a324f5a4af8c9f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1709.232667] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1709.235018] env[60788]: INFO nova.compute.manager [-] [instance: d63f9834-818b-4087-851c-d7394d20b89d] Took 0.05 seconds to deallocate network for instance. 
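[editor's note] Nearly every entry in this log that reads "Acquiring lock ... acquired ... released" is emitted by oslo.concurrency's lockutils wrappers (lockutils.py:404/409/423 for the decorator, :312/:315/:333 for the context manager), which also report the waited/held durations. A minimal sketch of both forms seen here, assuming an illustrative lock name:

from oslo_concurrency import lockutils

# decorator form, as used for the per-instance build/terminate locks
@lockutils.synchronized('d63f9834-818b-4087-851c-d7394d20b89d')
def do_terminate_instance():
    pass  # runs with the lock held; wait/hold times are logged at DEBUG

# context-manager form, as used for the refresh_cache-* locks
with lockutils.lock('refresh_cache-d63f9834-818b-4087-851c-d7394d20b89d'):
    pass  # rebuild the instance's network info cache while holding the lock
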
[ 1709.252621] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1709.252858] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1709.253024] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1709.253211] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1709.253359] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1709.253503] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1709.253739] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1709.253909] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1709.254090] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 
tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1709.254256] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1709.254424] env[60788]: DEBUG nova.virt.hardware [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1709.255498] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ea53cb1-ca65-4855-b6c4-b769f7fb505f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.265309] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d45aa92-59f6-4ae0-a18e-b2ecef8af277 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.325363] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cb8210ce-a68f-42e9-8a72-e0f9052a0d87 tempest-ServersAaction247Test-474212661 tempest-ServersAaction247Test-474212661-project-member] Lock "d63f9834-818b-4087-851c-d7394d20b89d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.537s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.326267] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "d63f9834-818b-4087-851c-d7394d20b89d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 26.576s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1709.326451] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: d63f9834-818b-4087-851c-d7394d20b89d] During sync_power_state the instance has a pending task (deleting). Skip. 
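[editor's note] The nova.virt.hardware entries above walk a brute-force search over CPU topologies: with no flavor or image limits, sockets, cores and threads each default to a 65536 maximum, and for the 1-vCPU m1.nano flavor exactly one factorization survives. A simplified, runnable sketch of that enumeration (not nova.virt.hardware itself):

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # yield every (sockets, cores, threads) triple whose product equals
    # the vCPU count, within the given limits
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    yield (sockets, cores, threads)

print(list(possible_topologies(1)))  # [(1, 1, 1)] -> "Got 1 possible topologies"
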
[ 1709.326629] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "d63f9834-818b-4087-851c-d7394d20b89d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.632794] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Successfully created port: 6e7ca9ba-374a-4747-a3cf-c8a65542a34d {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1710.331772] env[60788]: DEBUG nova.compute.manager [req-af50da72-c8a4-4673-8441-a2ab0e59fcfb req-391f4275-d052-4d41-acf1-c0de01c81b4b service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Received event network-vif-plugged-6e7ca9ba-374a-4747-a3cf-c8a65542a34d {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1710.332016] env[60788]: DEBUG oslo_concurrency.lockutils [req-af50da72-c8a4-4673-8441-a2ab0e59fcfb req-391f4275-d052-4d41-acf1-c0de01c81b4b service nova] Acquiring lock "4864273c-b505-4e31-bf7b-633ba1e99562-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1710.332287] env[60788]: DEBUG oslo_concurrency.lockutils [req-af50da72-c8a4-4673-8441-a2ab0e59fcfb req-391f4275-d052-4d41-acf1-c0de01c81b4b service nova] Lock "4864273c-b505-4e31-bf7b-633ba1e99562-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1710.332461] env[60788]: DEBUG oslo_concurrency.lockutils [req-af50da72-c8a4-4673-8441-a2ab0e59fcfb req-391f4275-d052-4d41-acf1-c0de01c81b4b service nova] Lock "4864273c-b505-4e31-bf7b-633ba1e99562-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1710.332628] env[60788]: DEBUG nova.compute.manager [req-af50da72-c8a4-4673-8441-a2ab0e59fcfb req-391f4275-d052-4d41-acf1-c0de01c81b4b service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] No waiting events found dispatching network-vif-plugged-6e7ca9ba-374a-4747-a3cf-c8a65542a34d {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1710.332792] env[60788]: WARNING nova.compute.manager [req-af50da72-c8a4-4673-8441-a2ab0e59fcfb req-391f4275-d052-4d41-acf1-c0de01c81b4b service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Received unexpected event network-vif-plugged-6e7ca9ba-374a-4747-a3cf-c8a65542a34d for instance with vm_state building and task_state spawning. 
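[editor's note] The CreateVM_Task sequence below ("Waiting for the task ... progress is 0% ... completed successfully") is oslo.vmware's task polling at work. A minimal sketch of that public API, assuming placeholder host, credentials, and managed-object references:

from oslo_vmware import api as vmware_api

session = vmware_api.VMwareAPISession(
    'vc1.example.org', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# placeholders: in Nova these references come from property-collector
# queries against the vim service
folder_ref = config_spec = respool_ref = None

# invoke_api() issues the SOAP call; wait_for_task() polls the returned
# Task managed object until it succeeds or raises on a fault
task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                          config=config_spec, pool=respool_ref)
task_info = session.wait_for_task(task)
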
[ 1710.373846] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Successfully updated port: 6e7ca9ba-374a-4747-a3cf-c8a65542a34d {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1710.385282] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "refresh_cache-4864273c-b505-4e31-bf7b-633ba1e99562" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1710.385425] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "refresh_cache-4864273c-b505-4e31-bf7b-633ba1e99562" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1710.385573] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1710.425791] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1710.583943] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Updating instance_info_cache with network_info: [{"id": "6e7ca9ba-374a-4747-a3cf-c8a65542a34d", "address": "fa:16:3e:9f:ea:0d", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e7ca9ba-37", "ovs_interfaceid": "6e7ca9ba-374a-4747-a3cf-c8a65542a34d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1710.596685] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "refresh_cache-4864273c-b505-4e31-bf7b-633ba1e99562" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1710.597083] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance network_info: |[{"id": "6e7ca9ba-374a-4747-a3cf-c8a65542a34d", "address": "fa:16:3e:9f:ea:0d", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e7ca9ba-37", "ovs_interfaceid": "6e7ca9ba-374a-4747-a3cf-c8a65542a34d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1710.597479] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9f:ea:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a4b6ddb2-2e19-4031-9b22-add90d41a114', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6e7ca9ba-374a-4747-a3cf-c8a65542a34d', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1710.605240] env[60788]: DEBUG oslo.service.loopingcall [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1710.605752] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1710.605988] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cfe5a73c-df17-4545-a4a8-21b7777caaba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1710.626922] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1710.626922] env[60788]: value = "task-2205275" [ 1710.626922] env[60788]: _type = "Task" [ 1710.626922] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1710.636203] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205275, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.138177] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205275, 'name': CreateVM_Task, 'duration_secs': 0.278187} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1711.138345] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1711.139033] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1711.139209] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1711.139524] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1711.139782] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3df1a11-2e77-45fb-8005-94216f28acc7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.144299] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 1711.144299] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52b8b34d-1a1b-e656-bf77-8f8a26b02763" [ 1711.144299] env[60788]: _type = "Task" [ 1711.144299] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1711.157561] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52b8b34d-1a1b-e656-bf77-8f8a26b02763, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.654538] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1711.654919] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1711.655012] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1712.363206] env[60788]: DEBUG nova.compute.manager [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Received event network-changed-6e7ca9ba-374a-4747-a3cf-c8a65542a34d {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1712.363409] env[60788]: DEBUG nova.compute.manager [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Refreshing instance network info cache due to event network-changed-6e7ca9ba-374a-4747-a3cf-c8a65542a34d. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1712.363652] env[60788]: DEBUG oslo_concurrency.lockutils [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] Acquiring lock "refresh_cache-4864273c-b505-4e31-bf7b-633ba1e99562" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1712.363872] env[60788]: DEBUG oslo_concurrency.lockutils [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] Acquired lock "refresh_cache-4864273c-b505-4e31-bf7b-633ba1e99562" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1712.363973] env[60788]: DEBUG nova.network.neutron [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Refreshing network info cache for port 6e7ca9ba-374a-4747-a3cf-c8a65542a34d {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1712.689125] env[60788]: DEBUG nova.network.neutron [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Updated VIF entry in instance network info cache for port 6e7ca9ba-374a-4747-a3cf-c8a65542a34d. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1712.689494] env[60788]: DEBUG nova.network.neutron [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Updating instance_info_cache with network_info: [{"id": "6e7ca9ba-374a-4747-a3cf-c8a65542a34d", "address": "fa:16:3e:9f:ea:0d", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e7ca9ba-37", "ovs_interfaceid": "6e7ca9ba-374a-4747-a3cf-c8a65542a34d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1712.698260] env[60788]: DEBUG oslo_concurrency.lockutils [req-ec79a340-841c-41fa-82e1-c363f2aee719 req-9192d891-dec7-45bd-a85a-9db9e6eba749 service nova] Releasing lock "refresh_cache-4864273c-b505-4e31-bf7b-633ba1e99562" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1718.545236] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1719.340217] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "db89c7e8-6d81-4c0a-9111-9f6256588967" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1719.340440] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1724.431742] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock 
"7cc29f7d-e708-44a9-8ab6-5204163e9c96" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1743.762923] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1745.755762] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1747.754217] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1748.755067] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1748.755435] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1748.755574] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1749.754935] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1749.755149] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1749.755384] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1749.776958] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.777265] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.777422] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.777561] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.777768] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.777922] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.778060] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.778185] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.778306] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.778425] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1749.778546] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1750.753500] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1750.753716] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1750.766412] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.766751] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1750.766832] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1750.767044] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1750.768199] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c004a948-9b3c-4a99-8b01-0ef86e20b8d2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.776958] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9e69238-fa92-4789-bf85-9fafc327ee9f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.790555] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-406c828d-7ea2-4ac6-b75c-8f6d371eca04 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.796448] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bb66f00-0440-498a-b51b-2e44f0bd5efa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1750.824997] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181255MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1750.825149] env[60788]: DEBUG 
oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1750.825332] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1750.894814] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 688ff077-9505-48f5-9117-0a7f115f254c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895036] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 58bbe972-5fc1-4627-90e4-91251e047e86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895228] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895384] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895534] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895662] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895778] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.895890] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.896009] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.896134] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1750.906500] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1750.906721] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1750.906867] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1751.027247] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caf661f5-46cc-4386-9819-87f044eee5f6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.034490] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3b8cf83-a1d7-4c39-a0e6-cff44f694594 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.064480] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac25845-3555-4775-a540-7a8a65cba2f0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.071194] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec51d7b2-2988-4b1b-8d97-251ca437aec5 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1751.083755] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1751.092128] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1751.108124] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1751.108306] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1752.108594] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1757.586894] env[60788]: WARNING oslo_vmware.rw_handles [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1757.586894] env[60788]: ERROR oslo_vmware.rw_handles [ 1757.587674] env[60788]: DEBUG 
nova.virt.vmwareapi.images [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1757.589223] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1757.589458] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Copying Virtual Disk [datastore2] vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/e8c3c425-92a1-46c3-9c37-44ef8c4a17c6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1757.589750] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e5c126f2-544a-47c6-9e92-05876bcf10ac {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.598320] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1757.598320] env[60788]: value = "task-2205276" [ 1757.598320] env[60788]: _type = "Task" [ 1757.598320] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1757.605754] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205276, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1758.109373] env[60788]: DEBUG oslo_vmware.exceptions [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1758.109653] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1758.110245] env[60788]: ERROR nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1758.110245] env[60788]: Faults: ['InvalidArgument'] [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Traceback (most recent call last): [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] yield resources [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self.driver.spawn(context, instance, image_meta, [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self._fetch_image_if_missing(context, vi) [ 1758.110245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] image_cache(vi, tmp_image_ds_loc) [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] vm_util.copy_virtual_disk( [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] session._wait_for_task(vmdk_copy_task) [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] return self.wait_for_task(task_ref) [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] return evt.wait() [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] result = hub.switch() [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1758.110618] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] return self.greenlet.switch() [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self.f(*self.args, **self.kw) [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] raise exceptions.translate_fault(task_info.error) [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Faults: ['InvalidArgument'] [ 1758.111043] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] [ 1758.111043] env[60788]: INFO nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Terminating instance [ 1758.112104] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1758.112315] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1758.112557] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aa99ba88-1398-4cf5-ba16-30094dec5b67 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.114802] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1758.115026] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1758.115736] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcee94f2-a02a-4722-993c-d689b0ff46fb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.123818] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1758.124044] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bae4c69a-5397-4f79-8ba4-4796b7218b4f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.126230] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1758.126403] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1758.127421] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2a666047-157e-4ba7-892e-888f3033009c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.132211] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Waiting for the task: (returnval){ [ 1758.132211] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525ea83f-a7b6-5410-ebc9-5ff7bcc3cc7b" [ 1758.132211] env[60788]: _type = "Task" [ 1758.132211] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1758.139310] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]525ea83f-a7b6-5410-ebc9-5ff7bcc3cc7b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1758.199745] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1758.199745] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1758.199745] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleting the datastore file [datastore2] 688ff077-9505-48f5-9117-0a7f115f254c {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1758.199745] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e77fae6e-792a-43f0-b3a2-3bd3c1836f66 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.206250] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1758.206250] env[60788]: value = "task-2205278" [ 1758.206250] env[60788]: _type = "Task" [ 1758.206250] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1758.213645] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205278, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1758.643636] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1758.643636] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Creating directory with path [datastore2] vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1758.643636] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-971aa99e-e2bd-4353-992c-4440e5580c51 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.654242] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Created directory with path [datastore2] vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1758.654435] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Fetch image to [datastore2] vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1758.654619] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1758.655377] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a933ae7-2973-4b44-a31e-26f44dc486e9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.661780] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3a5f87-a530-4380-bd2c-b44a87136f65 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.670427] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fe3dc5a-1eda-461d-b3a9-8adfc280f90f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.700475] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c10f33-416e-4f68-ae8e-094b941d2978 {{(pid=60788) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.705596] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e66a8384-50ef-4ab8-b558-cccc0a2daf3e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.714583] env[60788]: DEBUG oslo_vmware.api [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205278, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079154} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1758.714805] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1758.714981] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1758.715167] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1758.715337] env[60788]: INFO nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1758.717447] env[60788]: DEBUG nova.compute.claims [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1758.717619] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1758.717865] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1758.730853] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1758.779155] env[60788]: DEBUG oslo_vmware.rw_handles [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1758.838029] env[60788]: DEBUG oslo_vmware.rw_handles [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1758.838220] env[60788]: DEBUG oslo_vmware.rw_handles [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1758.936774] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-617c1e8f-1191-4066-b3c9-13b139d424fd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.943937] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa1281a-c03a-4493-ad44-62cdcabc1f89 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.974435] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a406b1bc-4082-488d-a771-edbb0fc9dd65 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.981012] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12450c40-0d0c-404a-884d-50c27175c603 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.993481] env[60788]: DEBUG nova.compute.provider_tree [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1759.001921] env[60788]: DEBUG nova.scheduler.client.report [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1759.014930] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.297s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.015459] env[60788]: ERROR nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1759.015459] env[60788]: Faults: ['InvalidArgument'] [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Traceback (most recent call last): [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1759.015459] 
env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self.driver.spawn(context, instance, image_meta, [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self._fetch_image_if_missing(context, vi) [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] image_cache(vi, tmp_image_ds_loc) [ 1759.015459] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] vm_util.copy_virtual_disk( [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] session._wait_for_task(vmdk_copy_task) [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] return self.wait_for_task(task_ref) [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] return evt.wait() [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] result = hub.switch() [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] return self.greenlet.switch() [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1759.015897] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] self.f(*self.args, **self.kw) [ 1759.016245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1759.016245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] raise exceptions.translate_fault(task_info.error) [ 1759.016245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1759.016245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Faults: ['InvalidArgument'] [ 1759.016245] env[60788]: ERROR nova.compute.manager [instance: 688ff077-9505-48f5-9117-0a7f115f254c] [ 1759.016245] env[60788]: DEBUG nova.compute.utils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1759.017662] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Build of instance 688ff077-9505-48f5-9117-0a7f115f254c was re-scheduled: A specified parameter was not correct: fileType [ 1759.017662] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1759.018399] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1759.018399] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1759.018520] env[60788]: DEBUG nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1759.018599] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1759.288380] env[60788]: DEBUG nova.network.neutron [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1759.298184] env[60788]: INFO nova.compute.manager [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Took 0.28 seconds to deallocate network for instance. [ 1759.396227] env[60788]: INFO nova.scheduler.client.report [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleted allocations for instance 688ff077-9505-48f5-9117-0a7f115f254c [ 1759.419509] env[60788]: DEBUG oslo_concurrency.lockutils [None req-b5eb69c4-21e7-49e9-afaa-c8811f1cce05 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "688ff077-9505-48f5-9117-0a7f115f254c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 624.030s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.422182] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "688ff077-9505-48f5-9117-0a7f115f254c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 427.444s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1759.422474] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "688ff077-9505-48f5-9117-0a7f115f254c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1759.422739] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "688ff077-9505-48f5-9117-0a7f115f254c-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1759.422942] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "688ff077-9505-48f5-9117-0a7f115f254c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.427221] env[60788]: INFO nova.compute.manager [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Terminating instance [ 1759.430673] env[60788]: DEBUG nova.compute.manager [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1759.431134] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1759.431678] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2bb4c449-a23f-470e-a208-837b0cf68357 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.436164] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1759.442983] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92368726-9c73-4ce7-8e5f-12e1f7857794 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.476096] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 688ff077-9505-48f5-9117-0a7f115f254c could not be found.
[ 1759.476361] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1759.476583] env[60788]: INFO nova.compute.manager [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1759.476894] env[60788]: DEBUG oslo.service.loopingcall [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1759.479387] env[60788]: DEBUG nova.compute.manager [-] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1759.479528] env[60788]: DEBUG nova.network.neutron [-] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1759.493247] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1759.493525] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1759.494974] env[60788]: INFO nova.compute.claims [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1759.602599] env[60788]: DEBUG nova.network.neutron [-] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1759.611477] env[60788]: INFO nova.compute.manager [-] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] Took 0.13 seconds to deallocate network for instance.
[ 1759.686574] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc25af86-91a2-4315-8a12-589135006a97 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.694397] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e501944-9d65-4093-89d2-a09555bcab6b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.727397] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07c633a5-4342-48a6-bac3-ce714b573622 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.730202] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5f410949-e169-4020-beb6-88085d2599b8 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "688ff077-9505-48f5-9117-0a7f115f254c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.309s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.731365] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "688ff077-9505-48f5-9117-0a7f115f254c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 76.980s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1759.731610] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 688ff077-9505-48f5-9117-0a7f115f254c] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1759.731836] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "688ff077-9505-48f5-9117-0a7f115f254c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.737906] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec65814a-d43e-4280-ae59-9b879a4bd973 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.756277] env[60788]: DEBUG nova.compute.provider_tree [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1759.767151] env[60788]: DEBUG nova.scheduler.client.report [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1759.785470] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.786060] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1759.828955] env[60788]: DEBUG nova.compute.utils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1759.830669] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Allocating IP information in the background.
{{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1759.830970] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1759.874591] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1759.919538] env[60788]: DEBUG nova.policy [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9608a7d578f54e3aa974e37153821d4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '936e92b1754a415b9b9d7cff62af1e2b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1759.940842] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1759.968190] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1759.968432] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1759.968587] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1759.968768] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1759.968917] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1759.969197] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1759.969502] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1759.969726] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1759.969959] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Got 1
possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1759.970194] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1759.970434] env[60788]: DEBUG nova.virt.hardware [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1759.971354] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8240a76-2e53-49d0-a0f4-6e81a12393cd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.979294] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66f98a78-891a-4d6a-bf62-f5d6c3b65ad9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1760.219418] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Successfully created port: 88451966-489a-4d75-bc08-83b0abf50489 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1760.883988] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Successfully updated port: 88451966-489a-4d75-bc08-83b0abf50489 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1760.899440] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "refresh_cache-db89c7e8-6d81-4c0a-9111-9f6256588967" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1760.899626] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "refresh_cache-db89c7e8-6d81-4c0a-9111-9f6256588967" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1760.899785] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1760.981415] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1761.315565] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Updating instance_info_cache with network_info: [{"id": "88451966-489a-4d75-bc08-83b0abf50489", "address": "fa:16:3e:f1:02:94", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88451966-48", "ovs_interfaceid": "88451966-489a-4d75-bc08-83b0abf50489", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1761.326776] env[60788]: DEBUG nova.compute.manager [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Received event network-vif-plugged-88451966-489a-4d75-bc08-83b0abf50489 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1761.327928] env[60788]: DEBUG oslo_concurrency.lockutils [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] Acquiring lock "db89c7e8-6d81-4c0a-9111-9f6256588967-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1761.327928] env[60788]: DEBUG oslo_concurrency.lockutils [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1761.327928] env[60788]: DEBUG oslo_concurrency.lockutils [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1761.327928] env[60788]: DEBUG nova.compute.manager [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] No waiting events found dispatching network-vif-plugged-88451966-489a-4d75-bc08-83b0abf50489 {{(pid=60788) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1761.328356] env[60788]: WARNING nova.compute.manager [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Received unexpected event network-vif-plugged-88451966-489a-4d75-bc08-83b0abf50489 for instance with vm_state building and task_state spawning. [ 1761.328356] env[60788]: DEBUG nova.compute.manager [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Received event network-changed-88451966-489a-4d75-bc08-83b0abf50489 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1761.328356] env[60788]: DEBUG nova.compute.manager [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Refreshing instance network info cache due to event network-changed-88451966-489a-4d75-bc08-83b0abf50489. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1761.328356] env[60788]: DEBUG oslo_concurrency.lockutils [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] Acquiring lock "refresh_cache-db89c7e8-6d81-4c0a-9111-9f6256588967" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1761.330491] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "refresh_cache-db89c7e8-6d81-4c0a-9111-9f6256588967" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1761.330777] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance network_info: |[{"id": "88451966-489a-4d75-bc08-83b0abf50489", "address": "fa:16:3e:f1:02:94", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88451966-48", "ovs_interfaceid": "88451966-489a-4d75-bc08-83b0abf50489", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1761.331041] env[60788]: DEBUG oslo_concurrency.lockutils [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] Acquired lock 
"refresh_cache-db89c7e8-6d81-4c0a-9111-9f6256588967" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1761.331215] env[60788]: DEBUG nova.network.neutron [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Refreshing network info cache for port 88451966-489a-4d75-bc08-83b0abf50489 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1761.332732] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f1:02:94', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed3ffc1d-9f86-4029-857e-6cd1d383edbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '88451966-489a-4d75-bc08-83b0abf50489', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1761.340721] env[60788]: DEBUG oslo.service.loopingcall [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1761.343628] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1761.344073] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fb40f473-bc36-40b1-a9c3-b559de291849 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.363898] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1761.363898] env[60788]: value = "task-2205279" [ 1761.363898] env[60788]: _type = "Task" [ 1761.363898] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1761.371232] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205279, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1761.594774] env[60788]: DEBUG nova.network.neutron [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Updated VIF entry in instance network info cache for port 88451966-489a-4d75-bc08-83b0abf50489. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1761.595175] env[60788]: DEBUG nova.network.neutron [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Updating instance_info_cache with network_info: [{"id": "88451966-489a-4d75-bc08-83b0abf50489", "address": "fa:16:3e:f1:02:94", "network": {"id": "a04fff0e-83a2-4524-9af1-57a336af8a31", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1653226873-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "936e92b1754a415b9b9d7cff62af1e2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88451966-48", "ovs_interfaceid": "88451966-489a-4d75-bc08-83b0abf50489", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1761.604678] env[60788]: DEBUG oslo_concurrency.lockutils [req-7d78c383-e667-4453-b56e-4d46ec1fb98d req-04121e91-2143-48c2-b996-b9ecf41cffa5 service nova] Releasing lock "refresh_cache-db89c7e8-6d81-4c0a-9111-9f6256588967" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1761.874980] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205279, 'name': CreateVM_Task, 'duration_secs': 0.26444} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1761.875254] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1761.875896] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1761.876925] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1761.876925] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1761.876925] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3f36324c-310b-4668-8374-2a7f0dded787 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1761.881017] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1761.881017] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5218026f-272f-a223-2003-9cec6a0c0ac7" [ 1761.881017] env[60788]: _type = "Task" [ 1761.881017] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1761.888418] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5218026f-272f-a223-2003-9cec6a0c0ac7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1762.391189] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1762.391466] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1762.391748] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1773.162978] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.163396] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1793.881502] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "6df14da6-6e82-4573-8dc3-27f8349e586f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1793.881868] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "6df14da6-6e82-4573-8dc3-27f8349e586f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1800.750312] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1804.837307] env[60788]: WARNING oslo_vmware.rw_handles [None 
req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1804.837307] env[60788]: ERROR oslo_vmware.rw_handles
[ 1804.838018] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1804.839712] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1804.839948] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Copying Virtual Disk [datastore2] vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/8f032187-c8af-491a-8a03-8b4e284fb547/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1804.840247] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-af167369-9ee4-41ce-988f-85d323ce602f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1804.848397] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Waiting for the task: (returnval){
[ 1804.848397] env[60788]: value = "task-2205280"
[ 1804.848397] env[60788]: _type = "Task"
[ 1804.848397] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1804.856389] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Task: {'id': task-2205280, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1805.358991] env[60788]: DEBUG oslo_vmware.exceptions [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1805.359269] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1805.359827] env[60788]: ERROR nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1805.359827] env[60788]: Faults: ['InvalidArgument']
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Traceback (most recent call last):
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     yield resources
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self.driver.spawn(context, instance, image_meta,
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self._fetch_image_if_missing(context, vi)
[ 1805.359827] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     image_cache(vi, tmp_image_ds_loc)
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     vm_util.copy_virtual_disk(
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     session._wait_for_task(vmdk_copy_task)
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     return self.wait_for_task(task_ref)
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     return evt.wait()
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     result = hub.switch()
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1805.360273] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     return self.greenlet.switch()
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self.f(*self.args, **self.kw)
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     raise exceptions.translate_fault(task_info.error)
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Faults: ['InvalidArgument']
[ 1805.360669] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]
[ 1805.360669] env[60788]: INFO nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Terminating instance
[ 1805.361726] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquired lock "[datastore2]
devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1805.361935] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1805.363017] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1805.363222] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1805.363451] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e0860ea2-51d1-413f-917e-ec662b581cb0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.365692] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31570e15-5227-4644-9f56-3dac72d8c4e8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.373679] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1805.373884] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-413b44f8-52f3-4950-a8b0-f60907985010 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.376011] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1805.376195] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1805.377187] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e8d4a31e-d91b-4dee-815b-977590d67ea6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.381777] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Waiting for the task: (returnval){ [ 1805.381777] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5277e49d-a7d5-5f20-da3f-8c5e3dba3342" [ 1805.381777] env[60788]: _type = "Task" [ 1805.381777] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1805.389025] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5277e49d-a7d5-5f20-da3f-8c5e3dba3342, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1805.445642] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1805.445888] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1805.446112] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Deleting the datastore file [datastore2] 58bbe972-5fc1-4627-90e4-91251e047e86 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1805.446413] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-177e8987-aa13-4275-b95c-6da5e0730940 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.452980] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Waiting for the task: (returnval){ [ 1805.452980] env[60788]: value = "task-2205282" [ 1805.452980] env[60788]: _type = "Task" [ 1805.452980] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1805.461264] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Task: {'id': task-2205282, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1805.754220] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1805.891968] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1805.892239] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Creating directory with path [datastore2] vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1805.892462] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-93da7ed7-9114-4d90-b05f-b931502bb933 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.903570] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Created directory with path [datastore2] vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1805.903752] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Fetch image to [datastore2] vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1805.903921] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1805.904685] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e5fdf4f-9ae9-4e82-b06b-ad845f045c97 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.910868] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0caa91-dd82-44c9-9515-6fd7cd4c7962 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.919731] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cf496add-e2f4-490c-b0d6-1b80305b46e6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.949721] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-837f18d6-cec9-4dc9-9464-71752a8eb178 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.958369] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f78ea6db-1793-4e38-9003-15396bfac5c7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1805.962582] env[60788]: DEBUG oslo_vmware.api [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Task: {'id': task-2205282, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.125184} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1805.963179] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1805.963384] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1805.963575] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1805.963749] env[60788]: INFO nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Took 0.60 seconds to destroy the instance on the hypervisor. 
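The DeleteDatastoreFile_Task records above show the task lifecycle this log repeats for every vCenter call: invoke the API, receive a task handle, poll it ("progress is 0%."), and finish on a poll that reports "completed successfully" together with a 'duration_secs'. A minimal sketch of that polling loop, assuming a hypothetical get_task_info(task_id) callable returning a plain dict in place of oslo.vmware's real session and TaskInfo object:

import time

def wait_for_task(get_task_info, task_id, interval=0.5, timeout=300.0):
    # get_task_info(task_id) is assumed (illustrative, not the real API) to
    # return {'state': 'running'|'success'|'error', 'progress': int,
    #         'error': str | None, ...}.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info(task_id)
        if info['state'] == 'success':
            return info  # caller can read e.g. info.get('duration_secs')
        if info['state'] == 'error':
            # oslo.vmware's _poll_task raises translate_fault(task_info.error);
            # a plain RuntimeError stands in for that here.
            raise RuntimeError(info.get('error') or 'task failed')
        # Mirrors the "Task: {'id': ...} progress is 0%." lines in this log.
        print("Task: {'id': %s} progress is %d%%." % (task_id, info.get('progress', 0)))
        time.sleep(interval)
    raise TimeoutError('task %s did not finish within %ss' % (task_id, timeout))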
[ 1805.965932] env[60788]: DEBUG nova.compute.claims [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1805.966135] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1805.966369] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1805.983218] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1806.155518] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e86118c5-b989-4d6d-a347-1e4a9289d56b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.163397] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be052ea0-4f0e-4d1d-bc04-a1c0f80429a5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.195276] env[60788]: DEBUG oslo_vmware.rw_handles [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1806.197149] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac6a227-7452-42df-8115-22dd47938323 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.259991] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbb82a77-0e9d-45a0-925e-926b3185fd69 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.264356] env[60788]: DEBUG oslo_vmware.rw_handles [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Completed reading data from the image iterator. 
{{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1806.264518] env[60788]: DEBUG oslo_vmware.rw_handles [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1806.275723] env[60788]: DEBUG nova.compute.provider_tree [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1806.284464] env[60788]: DEBUG nova.scheduler.client.report [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1806.300605] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.334s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1806.301146] env[60788]: ERROR nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1806.301146] env[60788]: Faults: ['InvalidArgument']
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Traceback (most recent call last):
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self.driver.spawn(context, instance, image_meta,
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self._fetch_image_if_missing(context, vi)
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     image_cache(vi, tmp_image_ds_loc)
[ 1806.301146] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     vm_util.copy_virtual_disk(
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     session._wait_for_task(vmdk_copy_task)
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     return self.wait_for_task(task_ref)
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     return evt.wait()
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     result = hub.switch()
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     return self.greenlet.switch()
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1806.301552] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     self.f(*self.args, **self.kw)
[ 1806.301943] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1806.301943] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]     raise exceptions.translate_fault(task_info.error)
[ 1806.301943] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1806.301943] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Faults: ['InvalidArgument']
[ 1806.301943] env[60788]: ERROR nova.compute.manager [instance: 58bbe972-5fc1-4627-90e4-91251e047e86]
[ 1806.301943] env[60788]: DEBUG nova.compute.utils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1806.303315] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Build of instance 58bbe972-5fc1-4627-90e4-91251e047e86 was re-scheduled: A specified parameter was not correct: fileType
[ 1806.303315] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 1806.303677] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 1806.303844] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 1806.304020] env[60788]: DEBUG nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 1806.304293] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1806.761899] env[60788]: DEBUG nova.network.neutron [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1806.773677] env[60788]: INFO nova.compute.manager [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Took 0.47 seconds to deallocate network for instance.
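The records above are the failure half of the resource claim taken when this build started: spawn raises VimFaultException, ResourceTracker.abort_instance_claim returns the claimed resources under the same "compute_resources" lock that guarded instance_claim, and only then is the build handed back for re-scheduling. A minimal sketch of that claim/abort shape, assuming a toy in-memory tracker (SimpleTracker and its counters are illustrative, not Nova's ResourceTracker):

from oslo_concurrency import lockutils

class SimpleTracker:
    # Toy stand-in for nova.compute.resource_tracker.ResourceTracker.
    def __init__(self, vcpus, memory_mb):
        self.free = {'vcpus': vcpus, 'memory_mb': memory_mb}

    def instance_claim(self, flavor):
        # Same serialization point as the "compute_resources" lock in the log.
        with lockutils.lock('compute_resources'):
            for key in self.free:
                if self.free[key] < flavor[key]:
                    raise RuntimeError('not enough %s' % key)
            for key in self.free:
                self.free[key] -= flavor[key]

    def abort_instance_claim(self, flavor):
        # Called when spawn fails, before the build is re-scheduled, so the
        # inventory reported to placement stays consistent.
        with lockutils.lock('compute_resources'):
            for key in self.free:
                self.free[key] += flavor[key]

# Usage with numbers from this log (48 VCPU / 196590 MB node, m1.nano flavor):
# tracker = SimpleTracker(vcpus=48, memory_mb=196590)
# tracker.instance_claim({'vcpus': 1, 'memory_mb': 128})
# tracker.abort_instance_claim({'vcpus': 1, 'memory_mb': 128})  # on spawn failure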
[ 1806.877929] env[60788]: INFO nova.scheduler.client.report [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Deleted allocations for instance 58bbe972-5fc1-4627-90e4-91251e047e86 [ 1806.904516] env[60788]: DEBUG oslo_concurrency.lockutils [None req-7dbdc54c-7f75-4c8c-8f11-8d3c02b10207 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "58bbe972-5fc1-4627-90e4-91251e047e86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 637.491s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.905723] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "58bbe972-5fc1-4627-90e4-91251e047e86" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.924s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.905960] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "58bbe972-5fc1-4627-90e4-91251e047e86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.906190] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "58bbe972-5fc1-4627-90e4-91251e047e86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.906360] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "58bbe972-5fc1-4627-90e4-91251e047e86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1806.908418] env[60788]: INFO nova.compute.manager [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Terminating instance [ 1806.910212] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquiring lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1806.910385] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Acquired lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1806.910552] env[60788]: DEBUG nova.network.neutron [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1806.915479] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1806.969769] env[60788]: DEBUG nova.network.neutron [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1806.976025] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1806.976280] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1806.977765] env[60788]: INFO nova.compute.claims [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1807.077606] env[60788]: DEBUG nova.network.neutron [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1807.086692] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Releasing lock "refresh_cache-58bbe972-5fc1-4627-90e4-91251e047e86" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1807.087114] env[60788]: DEBUG nova.compute.manager [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Start destroying the instance on the hypervisor. 
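The acquire/release pairs above ("acquired ... waited Ns" / "released ... held Ns") come from oslo.concurrency's lock decorator. A minimal sketch of the same pattern, assuming only the public lockutils API; the lock name is taken from the log and the function body is a placeholder:

```python
from oslo_concurrency import lockutils

INSTANCE_UUID = "58bbe972-5fc1-4627-90e4-91251e047e86"

# The decorator serializes callers on a per-name semaphore and emits the
# DEBUG acquire/release lines seen above.
@lockutils.synchronized(INSTANCE_UUID)
def do_terminate_instance():
    ...  # runs only while the per-instance lock is held
```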
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1807.087312] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1807.087848] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-afaf78ce-c004-42c6-aa49-c94948f2105d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.097621] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9af14542-1cda-40a3-97af-70508af26ee4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.128322] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 58bbe972-5fc1-4627-90e4-91251e047e86 could not be found. [ 1807.128508] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1807.128682] env[60788]: INFO nova.compute.manager [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1807.128952] env[60788]: DEBUG oslo.service.loopingcall [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1807.131111] env[60788]: DEBUG nova.compute.manager [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1807.131216] env[60788]: DEBUG nova.network.neutron [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1807.152097] env[60788]: DEBUG nova.network.neutron [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Instance cache missing network info.
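The WARNING/"Instance destroyed" pair above shows delete proceeding even though the VM never materialised in vCenter (the build had failed earlier). A sketch of that idempotent-destroy behaviour; the exception class and backend call are stand-ins, not Nova imports:

```python
class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy(backend, uuid):
    try:
        backend.destroy_vm(uuid)  # hypothetical hypervisor call
    except InstanceNotFound:
        # The VM is already gone on the backend, so deletion treats it as
        # destroyed and continues with network/quota cleanup, as logged above.
        pass
```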
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1807.160362] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcc4d800-f80a-4ba6-a9bd-81f8f2a431ba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.163401] env[60788]: DEBUG nova.network.neutron [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1807.169222] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80850684-bb3b-4756-b9ee-0ea068b270b2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.172770] env[60788]: INFO nova.compute.manager [-] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] Took 0.04 seconds to deallocate network for instance. [ 1807.201944] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b03eb80b-c878-45c8-99a9-96c6a4fc0c09 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.212043] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c271e58e-612e-44dc-8ae3-239700d62015 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.229098] env[60788]: DEBUG nova.compute.provider_tree [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1807.236672] env[60788]: DEBUG nova.scheduler.client.report [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1807.255118] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1807.255599] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Start building networks asynchronously for instance. 
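"Inventory has not changed" in the records above means the report client compared its cached provider inventory against the freshly computed one and skipped the placement update. The heart of that check reduces to a dict comparison; a simplified sketch using the logged VCPU record:

```python
def inventory_changed(cached: dict, computed: dict) -> bool:
    # Placement is only updated when the local view differs from the cache.
    return cached != computed

cached = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                   'step_size': 1, 'allocation_ratio': 4.0}}
assert not inventory_changed(cached, dict(cached))  # identical -> no PUT
```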
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1807.266728] env[60788]: DEBUG oslo_concurrency.lockutils [None req-a46a5254-b99e-4a89-ab97-c8d8dcc70b38 tempest-ServersTestManualDisk-522662310 tempest-ServersTestManualDisk-522662310-project-member] Lock "58bbe972-5fc1-4627-90e4-91251e047e86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.361s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1807.267518] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "58bbe972-5fc1-4627-90e4-91251e047e86" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 124.516s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1807.267701] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 58bbe972-5fc1-4627-90e4-91251e047e86] During sync_power_state the instance has a pending task (deleting). Skip. [ 1807.267895] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "58bbe972-5fc1-4627-90e4-91251e047e86" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1807.287427] env[60788]: DEBUG nova.compute.utils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1807.288648] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1807.288894] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1807.296231] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Start building block device mappings for instance.
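The sync_power_state record above skips the instance because a delete task is already in flight; the guard reduces to a task-state check. An illustrative sketch (plain Python, not Nova's actual method):

```python
def query_driver_power_state_and_sync(instance: dict) -> str:
    if instance.get("task_state") is not None:
        # e.g. task_state == "deleting": another code path owns the instance
        # right now, so the periodic power-state sync must not interfere.
        return "skip"
    return "sync"

assert query_driver_power_state_and_sync({"task_state": "deleting"}) == "skip"
```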
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1807.344735] env[60788]: DEBUG nova.policy [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '571aaecebbc249e3ae4d9306e1e109ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e80c355190594f5a960ca2d14c3f010c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1807.359145] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Start spawning the instance on the hypervisor. {{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1807.384832] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1807.385088] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1807.385249] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1807.385431] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1807.385577] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1807.385722] env[60788]: DEBUG nova.virt.hardware [None 
req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1807.385959] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1807.386150] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1807.386316] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1807.386478] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1807.386652] env[60788]: DEBUG nova.virt.hardware [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1807.387520] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f028f9f9-a08c-4893-8f95-bf250c939bfe {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.395815] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eddc18f-34fa-4cab-9c48-2ef8add31b63 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.643233] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Successfully created port: 5853dba2-8f48-4d5c-a1bc-de01dca51ae1 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1807.754179] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1808.644864] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] 
Successfully updated port: 5853dba2-8f48-4d5c-a1bc-de01dca51ae1 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1808.654766] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "refresh_cache-dbf41f65-ac34-4da6-837d-9d4e924fcf7c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1808.654908] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "refresh_cache-dbf41f65-ac34-4da6-837d-9d4e924fcf7c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1808.655061] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1808.692756] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1808.847337] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Updating instance_info_cache with network_info: [{"id": "5853dba2-8f48-4d5c-a1bc-de01dca51ae1", "address": "fa:16:3e:94:e9:dc", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5853dba2-8f", "ovs_interfaceid": "5853dba2-8f48-4d5c-a1bc-de01dca51ae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1808.862312] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "refresh_cache-dbf41f65-ac34-4da6-837d-9d4e924fcf7c" {{(pid=60788) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1808.862611] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance network_info: |[{"id": "5853dba2-8f48-4d5c-a1bc-de01dca51ae1", "address": "fa:16:3e:94:e9:dc", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5853dba2-8f", "ovs_interfaceid": "5853dba2-8f48-4d5c-a1bc-de01dca51ae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1808.862997] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:e9:dc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f4a795c-8718-4a7c-aafe-9da231df10f8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5853dba2-8f48-4d5c-a1bc-de01dca51ae1', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1808.870621] env[60788]: DEBUG oslo.service.loopingcall [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
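A few records back, nova.virt.hardware walked from flavor/image limits to "Got 1 possible topologies". That search amounts to enumerating (sockets, cores, threads) factorizations of the vCPU count under the per-dimension maxima; a self-contained sketch of the idea (not the exact Nova routine):

```python
def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    """Enumerate (sockets, cores, threads) triples with s * c * t == vcpus."""
    found = []
    for s in range(1, min(max_sockets, vcpus) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(max_cores, vcpus // s) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // (s * c)
            if t <= max_threads:
                found.append((s, c, t))
    return found

# One vCPU under the logged 65536/65536/65536 limits -> exactly [(1, 1, 1)].
print(possible_topologies(1, 65536, 65536, 65536))
```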
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1808.871131] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1808.871387] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-280bd852-f982-4939-918b-c27ee74b730f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.887934] env[60788]: DEBUG nova.compute.manager [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Received event network-vif-plugged-5853dba2-8f48-4d5c-a1bc-de01dca51ae1 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1808.888123] env[60788]: DEBUG oslo_concurrency.lockutils [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] Acquiring lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1808.888335] env[60788]: DEBUG oslo_concurrency.lockutils [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1808.888503] env[60788]: DEBUG oslo_concurrency.lockutils [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1808.888664] env[60788]: DEBUG nova.compute.manager [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] No waiting events found dispatching network-vif-plugged-5853dba2-8f48-4d5c-a1bc-de01dca51ae1 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1808.888878] env[60788]: WARNING nova.compute.manager [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Received unexpected event network-vif-plugged-5853dba2-8f48-4d5c-a1bc-de01dca51ae1 for instance with vm_state building and task_state spawning. [ 1808.889044] env[60788]: DEBUG nova.compute.manager [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Received event network-changed-5853dba2-8f48-4d5c-a1bc-de01dca51ae1 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1808.889210] env[60788]: DEBUG nova.compute.manager [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Refreshing instance network info cache due to event network-changed-5853dba2-8f48-4d5c-a1bc-de01dca51ae1.
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1808.889396] env[60788]: DEBUG oslo_concurrency.lockutils [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] Acquiring lock "refresh_cache-dbf41f65-ac34-4da6-837d-9d4e924fcf7c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1808.889536] env[60788]: DEBUG oslo_concurrency.lockutils [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] Acquired lock "refresh_cache-dbf41f65-ac34-4da6-837d-9d4e924fcf7c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1808.889691] env[60788]: DEBUG nova.network.neutron [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Refreshing network info cache for port 5853dba2-8f48-4d5c-a1bc-de01dca51ae1 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1808.896642] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1808.896642] env[60788]: value = "task-2205283" [ 1808.896642] env[60788]: _type = "Task" [ 1808.896642] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1808.908062] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205283, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1809.232168] env[60788]: DEBUG nova.network.neutron [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Updated VIF entry in instance network info cache for port 5853dba2-8f48-4d5c-a1bc-de01dca51ae1. 
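The "No waiting events found ... Received unexpected event" pair above is the event-dispatch pattern: Neutron's network-vif-plugged notification arrives, but nothing registered a waiter for it, so it is logged instead of waking a blocked greenthread. A toy sketch of that pop-or-warn behaviour (stand-in data structures, not Nova's InstanceEvents class):

```python
waiters = {}  # (instance_uuid, event_name) -> callback registered by a waiter

def external_instance_event(instance_uuid, event_name):
    callback = waiters.pop((instance_uuid, event_name), None)
    if callback is None:
        # Mirrors the WARNING above: event arrived with no one waiting.
        print(f"WARNING: unexpected event {event_name} for {instance_uuid}")
    else:
        callback()  # wake whoever was blocked on this event

external_instance_event(
    "dbf41f65-ac34-4da6-837d-9d4e924fcf7c",
    "network-vif-plugged-5853dba2-8f48-4d5c-a1bc-de01dca51ae1")
```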
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1809.232531] env[60788]: DEBUG nova.network.neutron [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Updating instance_info_cache with network_info: [{"id": "5853dba2-8f48-4d5c-a1bc-de01dca51ae1", "address": "fa:16:3e:94:e9:dc", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5853dba2-8f", "ovs_interfaceid": "5853dba2-8f48-4d5c-a1bc-de01dca51ae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1809.242091] env[60788]: DEBUG oslo_concurrency.lockutils [req-b4a5b321-b9c7-42e8-b8e8-52098fd6fbed req-46803682-4997-4b5c-b8ed-c5b3e89dd8d7 service nova] Releasing lock "refresh_cache-dbf41f65-ac34-4da6-837d-9d4e924fcf7c" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1809.406079] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205283, 'name': CreateVM_Task, 'duration_secs': 0.287417} completed successfully. 
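The CreateVM_Task records above ("progress is 0%" followed by "completed successfully" with a duration) show the wait_for_task pattern: poll the task, report progress, return on a terminal state. A generic sketch of that loop; fetch_task_info is a hypothetical stand-in for the vSphere property read, not an oslo.vmware call:

```python
import time

def wait_for_task(fetch_task_info, poll_interval=0.5):
    while True:
        info = fetch_task_info()          # e.g. {'state': 'running', ...}
        if info["state"] == "success":
            return info                   # logged as "completed successfully"
        if info["state"] == "error":
            raise RuntimeError(info.get("error", "task failed"))
        time.sleep(poll_interval)         # logged as "progress is N%"
```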
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1809.406292] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1809.413517] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1809.413687] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1809.413993] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1809.414253] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fcb3dddc-d741-4fb7-a76c-f6631cef4575 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1809.418586] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){ [ 1809.418586] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]522db8f7-1403-5601-c753-15ae2f4aa7df" [ 1809.418586] env[60788]: _type = "Task" [ 1809.418586] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1809.426219] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]522db8f7-1403-5601-c753-15ae2f4aa7df, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1809.753686] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1809.753981] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1809.754176] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1809.777037] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777193] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777324] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777452] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777575] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777697] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777836] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.777966] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. 
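The heal-cache pass above skips every instance that is still Building and ends with nothing to refresh. A compact sketch of that filter, using two of the instance states visible in the log:

```python
instances = [
    {"uuid": "fb532f8b-5323-4f7a-be64-c6076a1862ae", "vm_state": "building"},
    {"uuid": "dbf41f65-ac34-4da6-837d-9d4e924fcf7c", "vm_state": "building"},
]

# Only instances with a settled vm_state get their network info re-fetched.
to_heal = [i for i in instances if i["vm_state"] != "building"]
if not to_heal:
    print("Didn't find any instances for network info cache update.")
```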
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.778100] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.778219] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1809.778338] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1809.778780] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1809.778952] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1809.929041] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1809.929268] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1809.929470] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1810.753355] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.753552] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1810.753752] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1810.766587] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1810.766830] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1810.767009] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1810.767401] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1810.768485] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae8795f-d34d-4324-a771-336f42291c09 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1810.777361] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a116395f-7e9c-4cdd-9f12-4c35f3dc1d3a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1810.792280] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-603e52c2-bd77-4cf6-81fc-e33e1df378e2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1810.798712] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa6533d9-f573-4b89-9ad4-c848d418597a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1810.829313] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181266MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1810.829437] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1810.829619] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1810.921055] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance fb532f8b-5323-4f7a-be64-c6076a1862ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921203] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921203] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921381] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921444] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921592] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921728] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921845] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
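The per-instance allocations above ({'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, ten times) plus the 512 MB host reservation account exactly for the totals the tracker reports in the next records. A quick arithmetic check:

```python
instances = 10                       # ten allocations listed above
used_ram = 512 + instances * 128     # reserved host memory + per-instance RAM
used_disk = instances * 1            # GB
used_vcpus = instances * 1
# Matches "used_ram=1792MB ... used_disk=10GB ... used_vcpus=10" below.
assert (used_ram, used_disk, used_vcpus) == (1792, 10, 10)
```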
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.921961] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.922091] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1810.932575] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1810.932809] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1810.932957] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1811.064459] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a8217d-b5f4-4333-9d65-e8f851d6df22 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1811.072217] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fdc3adf-cd25-4af3-9f97-830150115112 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1811.101912] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4df1ca34-08c5-4323-8e1d-6ee0abb820cc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1811.108722] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64507439-6843-4454-a5bd-c490a0e08d25 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1811.121611] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1811.130066] env[60788]: DEBUG nova.scheduler.client.report [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1811.143409] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1811.143583] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1813.139413] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1813.754180] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1854.857170] env[60788]: WARNING oslo_vmware.rw_handles [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1854.857170] env[60788]: ERROR oslo_vmware.rw_handles [ 1854.858109] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 
tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1854.859617] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1854.859899] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Copying Virtual Disk [datastore2] vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/fa6b5f3a-9836-472a-90d2-b54ef68970b6/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1854.860215] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f718ce2b-81d7-456b-9438-66cb72663cb6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1854.868104] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Waiting for the task: (returnval){ [ 1854.868104] env[60788]: value = "task-2205284" [ 1854.868104] env[60788]: _type = "Task" [ 1854.868104] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1854.876642] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Task: {'id': task-2205284, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.379106] env[60788]: DEBUG oslo_vmware.exceptions [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1855.379440] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1855.380098] env[60788]: ERROR nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1855.380098] env[60788]: Faults: ['InvalidArgument'] [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Traceback (most recent call last): [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] yield resources [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self.driver.spawn(context, instance, image_meta, [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self._fetch_image_if_missing(context, vi) [ 1855.380098] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] image_cache(vi, tmp_image_ds_loc) [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] vm_util.copy_virtual_disk( [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] session._wait_for_task(vmdk_copy_task) [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] return self.wait_for_task(task_ref) [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] return evt.wait() [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] result = hub.switch() [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1855.380607] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] return self.greenlet.switch() [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self.f(*self.args, **self.kw) [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] raise exceptions.translate_fault(task_info.error) [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Faults: ['InvalidArgument'] [ 1855.381039] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] [ 1855.381039] env[60788]: INFO nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Terminating instance [ 1855.382094] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1855.382305] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1855.382542] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-7adda77d-20a0-459a-a2fc-750db1351781 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.384905] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1855.385106] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1855.385843] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-770fe88a-5a88-4473-bb82-c0c53f416282 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.392434] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1855.392648] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8e882e1c-88a9-4861-8955-a2cf947f87a3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.394718] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1855.394890] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1855.395830] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a5ec6d5d-1b86-4d63-b178-7444fc3ec5df {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.400469] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 1855.400469] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5299610f-9ab1-3f29-40e6-bb549e3d5537" [ 1855.400469] env[60788]: _type = "Task" [ 1855.400469] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.407344] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5299610f-9ab1-3f29-40e6-bb549e3d5537, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.465693] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1855.465935] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1855.466133] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Deleting the datastore file [datastore2] fb532f8b-5323-4f7a-be64-c6076a1862ae {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1855.466410] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1a94309a-41af-4ac6-a9a6-3e07eda31e78 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.476866] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Waiting for the task: (returnval){ [ 1855.476866] env[60788]: value = "task-2205286" [ 1855.476866] env[60788]: _type = "Task" [ 1855.476866] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.484869] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Task: {'id': task-2205286, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1855.945282] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1855.945282] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating directory with path [datastore2] vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1855.945282] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7da9bd97-72b9-4fb0-b30a-1617190b39d8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.945282] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Created directory with path [datastore2] vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1855.945665] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Fetch image to [datastore2] vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1855.945665] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1855.945665] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aada416-18e1-405b-8461-b0272826bf01 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.945665] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c16c0b1a-8fc6-408e-8670-87728dda9611 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.945665] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aad25f46-2ada-437d-8e5e-97e3306bae5f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.974899] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fb2dc83c-2ba6-4480-88e4-0211e748ba73 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.985649] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d8f03b5a-1a5a-464f-a63d-b4f16ac3c0ad {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.987288] env[60788]: DEBUG oslo_vmware.api [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Task: {'id': task-2205286, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074594} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1855.987524] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1855.987703] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1855.987865] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1855.988071] env[60788]: INFO nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Took 0.60 seconds to destroy the instance on the hypervisor. 
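The sequence above is the interesting failure: the image upload ended with http.client.RemoteDisconnected when the write handle was closed, the follow-up CopyVirtualDisk_Task then failed with VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), and _poll_task re-raised the translated fault through wait_for_task into the spawn path, after which the VM was unregistered and its datastore directory deleted. The sketch below mirrors the polling pattern behind the "Waiting for the task" / "progress is 0%" entries; it is a minimal illustration under stated assumptions, not oslo.vmware's actual code, and get_task_info plus VimFaultError are hypothetical stand-ins for the real TaskInfo accessors and oslo_vmware.exceptions.VimFaultException.

    import time

    class VimFaultError(Exception):
        # Hypothetical stand-in for oslo_vmware.exceptions.VimFaultException.
        def __init__(self, msg, fault_list):
            super().__init__(msg)
            self.fault_list = fault_list  # e.g. ['InvalidArgument']

    def poll_vcenter_task(get_task_info, task, interval=0.5):
        # Cycle through the vSphere TaskInfo states ('queued', 'running',
        # 'success', 'error') until the task settles; oslo.vmware drives the
        # same loop from a looping call and logs progress on each pass.
        while True:
            state, result, error = get_task_info(task)  # caller-supplied accessor
            if state in ('queued', 'running'):
                time.sleep(interval)
            elif state == 'success':
                return result
            else:
                # 'error': surface the VIM fault names to the caller, which
                # is what appears above as Faults: ['InvalidArgument'].
                raise VimFaultError(error['message'], error['faults'])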
[ 1855.990114] env[60788]: DEBUG nova.compute.claims [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1855.990296] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.990511] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1856.010018] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1856.064207] env[60788]: DEBUG oslo_vmware.rw_handles [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1856.123478] env[60788]: DEBUG oslo_vmware.rw_handles [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1856.123673] env[60788]: DEBUG oslo_vmware.rw_handles [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1856.222858] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74563252-f9f9-4553-a1f2-bf6785e66e16 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.230129] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0db0ce09-04b6-49ba-a190-075fbe2cb2ee {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.259875] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-557063eb-3a29-4a3e-b96a-af600e90f94e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.266564] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cd4d509-2cbf-4c69-869c-4348e5eaa6e6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.279379] env[60788]: DEBUG nova.compute.provider_tree [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1856.287958] env[60788]: DEBUG nova.scheduler.client.report [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1856.301341] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.311s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1856.301887] env[60788]: ERROR nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1856.301887] env[60788]: Faults: ['InvalidArgument'] [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Traceback (most recent call last): [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1856.301887] env[60788]: 
ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self.driver.spawn(context, instance, image_meta, [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self._fetch_image_if_missing(context, vi) [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] image_cache(vi, tmp_image_ds_loc) [ 1856.301887] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] vm_util.copy_virtual_disk( [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] session._wait_for_task(vmdk_copy_task) [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] return self.wait_for_task(task_ref) [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] return evt.wait() [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] result = hub.switch() [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] return self.greenlet.switch() [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1856.302305] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] self.f(*self.args, **self.kw) [ 1856.302720] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1856.302720] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] raise exceptions.translate_fault(task_info.error) [ 1856.302720] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1856.302720] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Faults: ['InvalidArgument'] [ 1856.302720] env[60788]: ERROR nova.compute.manager [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] [ 1856.302720] env[60788]: DEBUG nova.compute.utils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1856.303968] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Build of instance fb532f8b-5323-4f7a-be64-c6076a1862ae was re-scheduled: A specified parameter was not correct: fileType [ 1856.303968] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1856.304352] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1856.304520] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1856.304689] env[60788]: DEBUG nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1856.304847] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1856.710570] env[60788]: DEBUG nova.network.neutron [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1856.724434] env[60788]: INFO nova.compute.manager [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Took 0.42 seconds to deallocate network for instance. [ 1856.823742] env[60788]: INFO nova.scheduler.client.report [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Deleted allocations for instance fb532f8b-5323-4f7a-be64-c6076a1862ae [ 1856.848333] env[60788]: DEBUG oslo_concurrency.lockutils [None req-876fbd3d-8c5c-4a12-8f75-3e6a1e58f597 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 682.229s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1856.849634] env[60788]: DEBUG oslo_concurrency.lockutils [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 485.837s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1856.849867] env[60788]: DEBUG oslo_concurrency.lockutils [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Acquiring lock "fb532f8b-5323-4f7a-be64-c6076a1862ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1856.850091] env[60788]: DEBUG oslo_concurrency.lockutils [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" ::
waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1856.850269] env[60788]: DEBUG oslo_concurrency.lockutils [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1856.852383] env[60788]: INFO nova.compute.manager [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Terminating instance [ 1856.855118] env[60788]: DEBUG nova.compute.manager [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1856.855118] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1856.855268] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c287f1e2-94c3-4ad1-82bf-a7f17bc37c98 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.866139] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7102901-ea36-49c8-b1cf-2cdb16105bab {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.878470] env[60788]: DEBUG nova.compute.manager [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1856.899225] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fb532f8b-5323-4f7a-be64-c6076a1862ae could not be found. [ 1856.899436] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1856.899616] env[60788]: INFO nova.compute.manager [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Took 0.04 seconds to destroy the instance on the hypervisor.
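A note on the lock bookkeeping above: every oslo.concurrency critical section is bracketed by an Acquiring / acquired (:: waited N s) / "released" (:: held N s) triplet, so the per-instance lock "fb532f8b-..." being held for 682.229s by the build and then granted to the terminate request after a 485.837s wait is an ordinary serialized hand-off, not a deadlock. A minimal sketch of the same pattern, using the real lockutils.synchronized decorator on a hypothetical helper:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_claim(claims, instance_uuid):
        # Runs under the same named lock the ResourceTracker methods use;
        # the decorator's wrapper is the "inner" that emits the
        # Acquiring/acquired/released lines with waited/held timings.
        claims.pop(instance_uuid, None)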
[ 1856.899872] env[60788]: DEBUG oslo.service.loopingcall [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1856.900131] env[60788]: DEBUG nova.compute.manager [-] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1856.900256] env[60788]: DEBUG nova.network.neutron [-] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1856.928531] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1856.928782] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1856.930313] env[60788]: INFO nova.compute.claims [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1856.945027] env[60788]: DEBUG nova.network.neutron [-] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1856.966306] env[60788]: INFO nova.compute.manager [-] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] Took 0.07 seconds to deallocate network for instance. [ 1857.055092] env[60788]: DEBUG oslo_concurrency.lockutils [None req-49f53f02-8134-4d6a-9ba2-8eedbe35ec58 tempest-ServerActionsTestOtherA-732537322 tempest-ServerActionsTestOtherA-732537322-project-member] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.205s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.056085] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 174.305s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1857.056278] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: fb532f8b-5323-4f7a-be64-c6076a1862ae] During sync_power_state the instance has a pending task (deleting). Skip.
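The repeated "Inventory has not changed" records carry the provider's full inventory. Placement treats usable capacity as (total - reserved) * allocation_ratio, with max_unit bounding what a single allocation may claim, so the figures logged here give 192 schedulable VCPUs against 10 allocated, which is why the claim for instance 6df14da6-... succeeds immediately. A small worked example with the logged numbers (illustrative Python, not Nova code):

    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def capacity(inv):
        # Placement's capacity rule: (total - reserved) * allocation_ratio.
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inv.items()}

    print(capacity(inventory))
    # -> {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}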
[ 1857.056459] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "fb532f8b-5323-4f7a-be64-c6076a1862ae" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.121963] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3184dbe-add7-4a6d-9cce-8d3bf59817a1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.129799] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b69acb68-682f-4113-a750-7e7826a31315 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.160951] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb94e628-20c8-4888-8338-bd3f93e6650d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.170784] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea0f092e-6f5d-4c59-bfb5-191458b1ae97 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.183768] env[60788]: DEBUG nova.compute.provider_tree [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1857.192665] env[60788]: DEBUG nova.scheduler.client.report [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1857.208075] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.208556] env[60788]: DEBUG nova.compute.manager [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Start building networks asynchronously for instance.
{{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1857.240166] env[60788]: DEBUG nova.compute.utils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1857.241707] env[60788]: DEBUG nova.compute.manager [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1857.241896] env[60788]: DEBUG nova.network.neutron [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1857.250218] env[60788]: DEBUG nova.compute.manager [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1857.304779] env[60788]: DEBUG nova.policy [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a76df7c693b24512b3f6e13f0e279cc8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b0531506ed4843b80fbcf3c09c73aa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1857.312281] env[60788]: DEBUG nova.compute.manager [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1857.340122] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1857.340427] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1857.340591] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1857.340772] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1857.340918] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1857.341083] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1857.341295] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1857.341451] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1857.341616] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 
tempest-ServersTestJSON-960083798-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1857.341775] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1857.341945] env[60788]: DEBUG nova.virt.hardware [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1857.342841] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86ad243e-17ce-4600-b19a-4dce9ae1e9bd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.351173] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5b9f211-45fd-4477-81e3-26d987287f97 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.591812] env[60788]: DEBUG nova.network.neutron [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Successfully created port: e047c1f0-9385-4831-ba90-cc9bcfb1cde8 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1858.422659] env[60788]: DEBUG nova.network.neutron [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Successfully updated port: e047c1f0-9385-4831-ba90-cc9bcfb1cde8 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1858.434819] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "refresh_cache-6df14da6-6e82-4573-8dc3-27f8349e586f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1858.434973] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired lock "refresh_cache-6df14da6-6e82-4573-8dc3-27f8349e586f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1858.435142] env[60788]: DEBUG nova.network.neutron [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1858.472147] env[60788]: DEBUG nova.network.neutron [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1858.620831] env[60788]: DEBUG nova.network.neutron [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Updating instance_info_cache with network_info: [{"id": "e047c1f0-9385-4831-ba90-cc9bcfb1cde8", "address": "fa:16:3e:ad:28:8f", "network": {"id": "73db1047-0c76-4640-8949-602913ca4a2c", "bridge": "br-int", "label": "tempest-ServersTestJSON-360084334-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70b0531506ed4843b80fbcf3c09c73aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9aa05ef8-c7bb-4af5-983f-bfa0f3f88223", "external-id": "nsx-vlan-transportzone-135", "segmentation_id": 135, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape047c1f0-93", "ovs_interfaceid": "e047c1f0-9385-4831-ba90-cc9bcfb1cde8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1858.634162] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Releasing lock "refresh_cache-6df14da6-6e82-4573-8dc3-27f8349e586f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1858.634413] env[60788]: DEBUG nova.compute.manager [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Instance network_info: |[{"id": "e047c1f0-9385-4831-ba90-cc9bcfb1cde8", "address": "fa:16:3e:ad:28:8f", "network": {"id": "73db1047-0c76-4640-8949-602913ca4a2c", "bridge": "br-int", "label": "tempest-ServersTestJSON-360084334-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70b0531506ed4843b80fbcf3c09c73aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9aa05ef8-c7bb-4af5-983f-bfa0f3f88223", "external-id": "nsx-vlan-transportzone-135", "segmentation_id": 135, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape047c1f0-93", "ovs_interfaceid": "e047c1f0-9385-4831-ba90-cc9bcfb1cde8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 1858.634792] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:28:8f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9aa05ef8-c7bb-4af5-983f-bfa0f3f88223', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e047c1f0-9385-4831-ba90-cc9bcfb1cde8', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1858.642584] env[60788]: DEBUG oslo.service.loopingcall [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1858.643019] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1858.643244] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-992a119f-0a5c-4217-8de7-746d9c0f9872 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1858.664234] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1858.664234] env[60788]: value = "task-2205287" [ 1858.664234] env[60788]: _type = "Task" [ 1858.664234] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1858.671612] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205287, 'name': CreateVM_Task} progress is 0%. 
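CreateVM_Task runs asynchronously on the vCenter side; the "Waiting for the task ... to complete" and "progress is 0%" pairs above come from a poll loop that re-reads the task's state until it leaves the queued/running states. A rough, self-contained sketch of such a poller (the real loop is oslo_vmware.api's wait_for_task, which uses a green-thread looping call and fault translation rather than a plain sleep; get_task_info and the fake session below are stand-ins):

    import time

    def wait_for_task(session, task_ref, interval=0.5):
        # Poll the task's info until it succeeds; raise on error states.
        while True:
            info = session.get_task_info(task_ref)   # stand-in for a PropertyCollector read
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                raise RuntimeError(info.error)
            time.sleep(interval)                     # real code yields to the event loop instead

    class _FakeInfo:
        state, result = 'success', 'vm-ref'

    class _FakeSession:
        def get_task_info(self, task_ref):
            return _FakeInfo()

    print(wait_for_task(_FakeSession(), 'task-2205287'))   # -> 'vm-ref'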
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1858.802109] env[60788]: DEBUG nova.compute.manager [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Received event network-vif-plugged-e047c1f0-9385-4831-ba90-cc9bcfb1cde8 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1858.802333] env[60788]: DEBUG oslo_concurrency.lockutils [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] Acquiring lock "6df14da6-6e82-4573-8dc3-27f8349e586f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1858.802586] env[60788]: DEBUG oslo_concurrency.lockutils [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] Lock "6df14da6-6e82-4573-8dc3-27f8349e586f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1858.802786] env[60788]: DEBUG oslo_concurrency.lockutils [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] Lock "6df14da6-6e82-4573-8dc3-27f8349e586f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1858.802990] env[60788]: DEBUG nova.compute.manager [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] No waiting events found dispatching network-vif-plugged-e047c1f0-9385-4831-ba90-cc9bcfb1cde8 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1858.803366] env[60788]: WARNING nova.compute.manager [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Received unexpected event network-vif-plugged-e047c1f0-9385-4831-ba90-cc9bcfb1cde8 for instance with vm_state building and task_state spawning. [ 1858.803569] env[60788]: DEBUG nova.compute.manager [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Received event network-changed-e047c1f0-9385-4831-ba90-cc9bcfb1cde8 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1858.803767] env[60788]: DEBUG nova.compute.manager [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Refreshing instance network info cache due to event network-changed-e047c1f0-9385-4831-ba90-cc9bcfb1cde8. 
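The event records above show the compute manager's external-event plumbing: Neutron's network-vif-plugged callback is popped under a per-instance "<uuid>-events" lock, and because the spawn path had not registered a waiter for it, it is logged as unexpected and dropped, while the network-changed event triggers a cache refresh instead. A toy version of that pop-or-warn pattern under assumed names (nova's real implementation keys locks by instance UUID and uses eventlet primitives):

    import threading
    from collections import defaultdict

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()        # stand-in for the per-instance "-events" lock
            self._waiters = defaultdict(dict)    # instance uuid -> {event name: Event}

        def prepare(self, uuid, name):
            ev = threading.Event()
            with self._lock:
                self._waiters[uuid][name] = ev
            return ev

        def pop(self, uuid, name):
            with self._lock:
                return self._waiters[uuid].pop(name, None)

    events = InstanceEvents()
    # No waiter was prepared while the instance is still spawning:
    if events.pop('6df14da6', 'network-vif-plugged-e047c1f0') is None:
        print('Received unexpected event')       # mirrors the WARNING above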
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1858.804049] env[60788]: DEBUG oslo_concurrency.lockutils [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] Acquiring lock "refresh_cache-6df14da6-6e82-4573-8dc3-27f8349e586f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1858.804330] env[60788]: DEBUG oslo_concurrency.lockutils [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] Acquired lock "refresh_cache-6df14da6-6e82-4573-8dc3-27f8349e586f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1858.804514] env[60788]: DEBUG nova.network.neutron [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Refreshing network info cache for port e047c1f0-9385-4831-ba90-cc9bcfb1cde8 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1859.174923] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205287, 'name': CreateVM_Task, 'duration_secs': 0.287403} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1859.175112] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1859.175734] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1859.175902] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1859.176243] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1859.176500] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f1b419aa-2d0e-4977-8df0-ff917e1a68a9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1859.181258] env[60788]: DEBUG oslo_vmware.api [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for the task: (returnval){ [ 1859.181258] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52e3fdd9-0e26-46cc-85fa-295d19b688e0" [ 1859.181258] env[60788]: _type = "Task" [ 1859.181258] env[60788]: } to complete. 
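Before fetching anything, the driver serializes access to the per-image cache directory: the lock and semaphore names above are simply the datastore path of the cached VMDK, and SearchDatastore_Task then checks whether that file already exists. A small helper reproducing the path format seen in those lock messages (illustrative formatting only):

    def cached_image_path(datastore, cache_folder, image_id):
        # "[datastore2] devstack-image-cache_base/<id>/<id>.vmdk" as in the lock names.
        return f'[{datastore}] {cache_folder}/{image_id}/{image_id}.vmdk'

    print(cached_image_path('datastore2', 'devstack-image-cache_base',
                            '1d9d6f6c-1335-48c8-9690-b6c8e781cb21'))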
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1859.188865] env[60788]: DEBUG oslo_vmware.api [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52e3fdd9-0e26-46cc-85fa-295d19b688e0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1859.358361] env[60788]: DEBUG nova.network.neutron [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Updated VIF entry in instance network info cache for port e047c1f0-9385-4831-ba90-cc9bcfb1cde8. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1859.358714] env[60788]: DEBUG nova.network.neutron [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Updating instance_info_cache with network_info: [{"id": "e047c1f0-9385-4831-ba90-cc9bcfb1cde8", "address": "fa:16:3e:ad:28:8f", "network": {"id": "73db1047-0c76-4640-8949-602913ca4a2c", "bridge": "br-int", "label": "tempest-ServersTestJSON-360084334-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70b0531506ed4843b80fbcf3c09c73aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9aa05ef8-c7bb-4af5-983f-bfa0f3f88223", "external-id": "nsx-vlan-transportzone-135", "segmentation_id": 135, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape047c1f0-93", "ovs_interfaceid": "e047c1f0-9385-4831-ba90-cc9bcfb1cde8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1859.370356] env[60788]: DEBUG oslo_concurrency.lockutils [req-83565c55-3ba7-43fd-bde9-ea43e7a410fc req-70df8984-9b0e-43b4-bae2-efa6dd7b16bf service nova] Releasing lock "refresh_cache-6df14da6-6e82-4573-8dc3-27f8349e586f" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1859.691485] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1859.691755] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1859.691970] env[60788]: 
DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1864.156043] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "4864273c-b505-4e31-bf7b-633ba1e99562" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1867.754034] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1868.753798] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1869.756049] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1869.756049] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1869.756049] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1869.776421] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.777261] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.777261] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.777261] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.777444] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.777444] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.777530] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.778320] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.778320] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.778320] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1869.778320] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1869.778519] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1869.778663] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1870.753780] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1870.753967] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
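The _heal_instance_info_cache run above skips every instance because all ten are still in the Building state, where the spawn path itself owns the network info cache; only steady-state instances are healed, hence "Didn't find any instances for network info cache update." A compressed sketch of that filter with assumed field names:

    def instances_to_heal(instances):
        # Instances being built populate their own network cache; skip them.
        for inst in instances:
            if inst['vm_state'] == 'building':
                print(f"Skipping network cache update for {inst['uuid']}: Building")
                continue
            yield inst

    insts = [{'uuid': '6df14da6', 'vm_state': 'building'}]
    assert list(instances_to_heal(insts)) == []   # nothing left to heal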
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1870.754173] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1870.765431] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.765741] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1870.765887] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1870.766056] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1870.770140] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9814f97-2251-411d-b47b-8be3462c5b2f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.778595] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ad0928-2293-4126-a5e2-7e3d24bc4f90 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.793850] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d754da86-92dc-4757-abed-08c5f4944ae9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.800724] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2115d86b-b597-44cf-b4ac-cdc69af66859 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.829169] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181267MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1870.829350] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1870.829555] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1870.900999] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.901228] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.901374] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.901546] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.901690] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.901832] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.901958] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.902109] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.902270] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.902401] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1870.902617] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1870.902786] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1871.031120] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-466b3d4b-6380-45cd-872a-04ad93fd6fcf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.038814] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-297cbb80-cfd6-4e9f-94e9-ffb6e23ce230 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.071017] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc871c20-54b7-4e32-9c4a-fe2508dc759d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.078096] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8120db2e-e2df-4907-a2ec-2c0c89bcd969 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.091261] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1871.099033] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
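The resource audit above can be verified by hand: ten instances each hold a {VCPU: 1, MEMORY_MB: 128, DISK_GB: 1} allocation, and the MEMORY_MB inventory reserves 512 MB for the host, which is exactly how the tracker arrives at used_vcpus=10, used_disk=10GB and used_ram=1792MB:

    # Per-instance allocations as reported above, times ten instances.
    allocs = [{'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}] * 10
    used = {rc: sum(a[rc] for a in allocs) for rc in allocs[0]}
    print(used)                                # {'VCPU': 10, 'MEMORY_MB': 1280, 'DISK_GB': 10}
    assert used['MEMORY_MB'] + 512 == 1792     # reserved memory counts as used

    # Schedulable capacity follows from the inventory's allocation ratios:
    print(48 * 4.0)                            # 192 VCPUs placement will allocate against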
{{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1871.116298] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1871.116482] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1873.111667] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1874.753634] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1902.634182] env[60788]: WARNING oslo_vmware.rw_handles [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1902.634182] env[60788]: ERROR oslo_vmware.rw_handles [ 1902.634951] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1902.636545] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 
tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1902.636792] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Copying Virtual Disk [datastore2] vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/d0f0e3d0-62ec-4dd2-9789-44e01ec87d28/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1902.637103] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ba1b4ff2-c75d-44e5-b5a1-a69f6491bc8a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.645303] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 1902.645303] env[60788]: value = "task-2205288" [ 1902.645303] env[60788]: _type = "Task" [ 1902.645303] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1902.653497] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': task-2205288, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.155434] env[60788]: DEBUG oslo_vmware.exceptions [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Fault InvalidArgument not matched. 
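When CopyVirtualDisk_Task fails, oslo_vmware tries to map the fault name onto a dedicated exception class; "Fault InvalidArgument not matched" above means no specific class exists, so the error surfaces as a generic VimFaultException carrying the fault list. A toy version of that lookup (the registry below is hypothetical; only the fallback behaviour is being illustrated):

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    _FAULT_CLASSES = {'FileNotFound': FileNotFoundError}   # hypothetical registry

    def get_fault_class(name):
        cls = _FAULT_CLASSES.get(name)
        if cls is None:
            print(f'Fault {name} not matched.')            # the DEBUG line above
        return cls

    if get_fault_class('InvalidArgument') is None:
        exc = VimFaultException(['InvalidArgument'],
                                'A specified parameter was not correct: fileType')
        print(exc.fault_list, exc)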
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1903.155724] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1903.156293] env[60788]: ERROR nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1903.156293] env[60788]: Faults: ['InvalidArgument'] [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Traceback (most recent call last): [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] yield resources [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self.driver.spawn(context, instance, image_meta, [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self._fetch_image_if_missing(context, vi) [ 1903.156293] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] image_cache(vi, tmp_image_ds_loc) [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] vm_util.copy_virtual_disk( [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] session._wait_for_task(vmdk_copy_task) [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] return self.wait_for_task(task_ref) [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] return evt.wait() [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] result = hub.switch() [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1903.156736] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] return self.greenlet.switch() [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self.f(*self.args, **self.kw) [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] raise exceptions.translate_fault(task_info.error) [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Faults: ['InvalidArgument'] [ 1903.157145] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] [ 1903.157145] env[60788]: INFO nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Terminating instance [ 1903.158476] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1903.158674] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1903.158907] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9d858701-2a0c-4f6f-90fb-621db85d8755 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.161246] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1903.161440] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1903.162151] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26d2ee42-7980-4b94-a99f-cc977eba4b63 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.168637] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1903.168841] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2c3fff24-602c-4b01-b108-6e69fb6b86b8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.170939] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1903.171141] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1903.172093] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0e34291d-a873-4201-a576-8289e39622e9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.176761] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1903.176761] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52504ca4-31d7-3d4d-691e-d101c126c7ae" [ 1903.176761] env[60788]: _type = "Task" [ 1903.176761] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1903.183796] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52504ca4-31d7-3d4d-691e-d101c126c7ae, 'name': SearchDatastore_Task} progress is 0%. 
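The teardown above follows the driver's destroy ordering: unregister the VM from the vCenter inventory first, then (as the records just below show) delete the instance's directory from the datastore with FileManager.DeleteDatastoreFile_Task. A schematic of that sequence with stand-in session calls (not the real oslo_vmware API):

    def destroy_instance(session, vm_ref, datastore, instance_uuid):
        session.invoke('UnregisterVM', vm_ref)            # drop the VM from inventory
        path = f'[{datastore}] {instance_uuid}'           # the instance's datastore directory
        task = session.invoke('FileManager.DeleteDatastoreFile_Task', path)
        session.wait_for_task(task)                       # contents deleted from the datastore

    class _FakeSession:                                   # just enough to run the sketch
        def invoke(self, method, *args):
            print('Invoking', method, *args)
            return 'task-2205290'
        def wait_for_task(self, task):
            print('Waiting for', task)

    destroy_instance(_FakeSession(), 'vm-ref', 'datastore2',
                     'f08a350c-54b6-44ce-bb3f-b9ab5deacf9d')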
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.247454] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1903.247674] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1903.247853] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Deleting the datastore file [datastore2] f08a350c-54b6-44ce-bb3f-b9ab5deacf9d {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1903.248142] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-54768759-8233-47c1-a2d5-9280a3f89aee {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.254909] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 1903.254909] env[60788]: value = "task-2205290" [ 1903.254909] env[60788]: _type = "Task" [ 1903.254909] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1903.262248] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': task-2205290, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.687369] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1903.687697] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1903.687819] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f99d4da-d847-48dd-b875-ce58fc82fbfd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.698372] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1903.698551] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Fetch image to [datastore2] vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1903.698717] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1903.699435] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1571ad89-ed24-45ae-a80f-da2fd3bdb403 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.706583] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f164a70c-138d-432e-ac31-69009b705b4a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.717164] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d74d89bb-75de-4668-a1a1-21bb7f325b20 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.747469] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a34a034e-4c07-44af-90a9-b1b7b961202d {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.753041] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dec5b379-aa2a-45a3-9ab1-d6b5c8bdc77e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.763612] env[60788]: DEBUG oslo_vmware.api [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': task-2205290, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073205} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1903.763837] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1903.764025] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1903.764201] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1903.764377] env[60788]: INFO nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Took 0.60 seconds to destroy the instance on the hypervisor. 
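Two requests interleave through this stretch: req-416462d7 tearing down f08a350c while req-9150a31a prepares the image fetch for 67c365fa. Since every record carries its request ID, a log like this can be de-interleaved mechanically; a small sketch that splits the text on record headers and filters by request ID (the sample string is invented for the demo):

    import re

    _RECORD = re.compile(r'\[\s*\d+\.\d+\] env\[\d+\]:')

    def records_for(log_text, req_id):
        # Record boundaries are the "[ <ts>] env[<pid>]:" headers.
        starts = [m.start() for m in _RECORD.finditer(log_text)]
        for s, e in zip(starts, starts[1:] + [len(log_text)]):
            rec = log_text[s:e].strip()
            if req_id in rec:
                yield rec

    sample = ('[ 1903.763612] env[60788]: DEBUG oslo_vmware.api [None req-416462d7 ...] done. '
              '[ 1903.766430] env[60788]: DEBUG nova.compute.claims [None req-other ...] Aborting claim:')
    for rec in records_for(sample, 'req-416462d7'):
        print(rec)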
[ 1903.766430] env[60788]: DEBUG nova.compute.claims [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1903.766596] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1903.766817] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1903.774367] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1903.826943] env[60788]: DEBUG oslo_vmware.rw_handles [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1903.891977] env[60788]: DEBUG oslo_vmware.rw_handles [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1903.892199] env[60788]: DEBUG oslo_vmware.rw_handles [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1903.996544] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16588437-2175-4c9d-b123-0ace3c7fe347 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.004291] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc0b13ee-d512-4646-a048-8361910ef8a7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.035987] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-945715fd-8c3f-4ce8-810d-4c6487c132f8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.043300] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddc3d334-c3aa-4145-9897-18d58e2e3c86 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.056440] env[60788]: DEBUG nova.compute.provider_tree [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1904.065146] env[60788]: DEBUG nova.scheduler.client.report [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1904.080018] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.313s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.080552] env[60788]: ERROR nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1904.080552] env[60788]: Faults: ['InvalidArgument'] [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Traceback (most recent call last): [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1904.080552] 
env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self.driver.spawn(context, instance, image_meta, [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self._fetch_image_if_missing(context, vi) [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] image_cache(vi, tmp_image_ds_loc) [ 1904.080552] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] vm_util.copy_virtual_disk( [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] session._wait_for_task(vmdk_copy_task) [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] return self.wait_for_task(task_ref) [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] return evt.wait() [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] result = hub.switch() [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] return self.greenlet.switch() [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1904.080903] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] self.f(*self.args, **self.kw) [ 1904.081503] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1904.081503] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] raise exceptions.translate_fault(task_info.error) [ 1904.081503] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1904.081503] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Faults: ['InvalidArgument'] [ 1904.081503] env[60788]: ERROR nova.compute.manager [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] [ 1904.081503] env[60788]: DEBUG nova.compute.utils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1904.082634] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Build of instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d was re-scheduled: A specified parameter was not correct: fileType [ 1904.082634] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1904.083007] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1904.083190] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1904.083357] env[60788]: DEBUG nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1904.083518] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1904.403470] env[60788]: DEBUG nova.network.neutron [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.413929] env[60788]: INFO nova.compute.manager [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Took 0.33 seconds to deallocate network for instance. [ 1904.502191] env[60788]: INFO nova.scheduler.client.report [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Deleted allocations for instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d [ 1904.525888] env[60788]: DEBUG oslo_concurrency.lockutils [None req-416462d7-c584-49eb-9d9e-f2967c0548b2 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 630.410s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.526184] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.279s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.526407] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.526614] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.526982] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.531744] env[60788]: INFO nova.compute.manager [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Terminating instance [ 1904.533554] env[60788]: DEBUG nova.compute.manager [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1904.533752] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1904.534027] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-25389049-b428-42d1-8993-adf4efff34f1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.543550] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a7245f0-953f-4192-9313-54fe475da39b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.572412] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f08a350c-54b6-44ce-bb3f-b9ab5deacf9d could not be found. [ 1904.572627] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1904.572806] env[60788]: INFO nova.compute.manager [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1904.573058] env[60788]: DEBUG oslo.service.loopingcall [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1904.573281] env[60788]: DEBUG nova.compute.manager [-] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1904.573382] env[60788]: DEBUG nova.network.neutron [-] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1904.613574] env[60788]: DEBUG nova.network.neutron [-] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.622691] env[60788]: INFO nova.compute.manager [-] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] Took 0.05 seconds to deallocate network for instance. [ 1904.705684] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ccfed590-1d76-4da1-9a47-0886fec173f5 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.706684] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 221.955s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.706684] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: f08a350c-54b6-44ce-bb3f-b9ab5deacf9d] During sync_power_state the instance has a pending task (deleting). Skip. 
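The oslo.service.loopingcall entry above ("Waiting for function ... _deallocate_network_with_retries to return") is the retry machinery around network deallocation: the wrapped function is invoked repeatedly by a looping call until it signals completion. A minimal sketch of the primitive, using the fixed-interval variant for brevity (Nova's deallocation wrapper uses a back-off flavour of the same family); the body below is a stand-in, not Nova's code:

```python
# Minimal sketch of the oslo.service looping-call primitive referenced by
# the "Waiting for function ... to return" entry above. The retry body is
# illustrative only.
from oslo_service import loopingcall

attempts = {'n': 0}

def _deallocate_with_retries():
    attempts['n'] += 1
    if attempts['n'] < 3:
        return   # not done yet; the loop calls us again next interval
    # Raising LoopingCallDone stops the loop; retvalue becomes .wait()'s
    # return value.
    raise loopingcall.LoopingCallDone(retvalue=attempts['n'])

timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
result = timer.start(interval=0.1).wait()   # blocks until LoopingCallDone
print('finished after', result, 'attempts')
```

start() returns an event; wait() blocks until the wrapped function raises LoopingCallDone, which is how the manager above knows the deallocation has finished.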
[ 1904.706849] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "f08a350c-54b6-44ce-bb3f-b9ab5deacf9d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1915.757010] env[60788]: DEBUG oslo_concurrency.lockutils [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "db89c7e8-6d81-4c0a-9111-9f6256588967" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.429225] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "c4697916-5d18-4d2b-9e12-91801de44580" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.429528] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "c4697916-5d18-4d2b-9e12-91801de44580" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1921.439923] env[60788]: DEBUG nova.compute.manager [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 1921.498454] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1921.498754] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1921.500207] env[60788]: INFO nova.compute.claims [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1921.664489] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ad1a64d-e931-4d92-a793-f89bdfaa6186 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.672232] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d37c4780-c1d6-43d2-9bc9-2a248e26fd47 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.704593] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a1bf73-1262-4ad9-aabe-e3211f00d1d9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.711927] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9467072e-5129-4f9e-9c04-2b9b9538efc1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.725577] env[60788]: DEBUG nova.compute.provider_tree [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1921.734723] env[60788]: DEBUG nova.scheduler.client.report [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1921.748495] env[60788]: DEBUG 
oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1921.749143] env[60788]: DEBUG nova.compute.manager [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 1921.780804] env[60788]: DEBUG nova.compute.utils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1921.782212] env[60788]: DEBUG nova.compute.manager [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 1921.782378] env[60788]: DEBUG nova.network.neutron [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1921.791424] env[60788]: DEBUG nova.compute.manager [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 1921.857715] env[60788]: DEBUG nova.compute.manager [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 1921.871279] env[60788]: DEBUG nova.policy [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89b673319ed34de9859c0f58f1c616c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d4606e74dad40acba2d78ea01a69919', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 1921.894304] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1921.894565] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1921.894725] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1921.894905] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1921.895066] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1921.895221] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1921.895429] 
env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1921.895590] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1921.895755] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1921.895916] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1921.896106] env[60788]: DEBUG nova.virt.hardware [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1921.897250] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f38ff70d-f5f2-419b-bf40-1cde9302daef {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1921.905882] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac70bb83-da38-4512-aaab-845e031f82c8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.199114] env[60788]: DEBUG nova.network.neutron [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Successfully created port: d4b2d3c8-5f64-45f0-a729-1f2e63601a5c {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1923.220032] env[60788]: DEBUG nova.compute.manager [req-df477037-7c1b-40f8-8eab-567d1a2df501 req-2fa38424-c689-4780-b42c-edcf41ef2b92 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Received event network-vif-plugged-d4b2d3c8-5f64-45f0-a729-1f2e63601a5c {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1923.220274] env[60788]: DEBUG oslo_concurrency.lockutils [req-df477037-7c1b-40f8-8eab-567d1a2df501 req-2fa38424-c689-4780-b42c-edcf41ef2b92 service nova] Acquiring lock "c4697916-5d18-4d2b-9e12-91801de44580-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1923.220425] env[60788]: DEBUG oslo_concurrency.lockutils 
[req-df477037-7c1b-40f8-8eab-567d1a2df501 req-2fa38424-c689-4780-b42c-edcf41ef2b92 service nova] Lock "c4697916-5d18-4d2b-9e12-91801de44580-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1923.220562] env[60788]: DEBUG oslo_concurrency.lockutils [req-df477037-7c1b-40f8-8eab-567d1a2df501 req-2fa38424-c689-4780-b42c-edcf41ef2b92 service nova] Lock "c4697916-5d18-4d2b-9e12-91801de44580-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1923.220665] env[60788]: DEBUG nova.compute.manager [req-df477037-7c1b-40f8-8eab-567d1a2df501 req-2fa38424-c689-4780-b42c-edcf41ef2b92 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] No waiting events found dispatching network-vif-plugged-d4b2d3c8-5f64-45f0-a729-1f2e63601a5c {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1923.220830] env[60788]: WARNING nova.compute.manager [req-df477037-7c1b-40f8-8eab-567d1a2df501 req-2fa38424-c689-4780-b42c-edcf41ef2b92 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Received unexpected event network-vif-plugged-d4b2d3c8-5f64-45f0-a729-1f2e63601a5c for instance with vm_state building and task_state spawning. [ 1923.264124] env[60788]: DEBUG nova.network.neutron [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Successfully updated port: d4b2d3c8-5f64-45f0-a729-1f2e63601a5c {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1923.277121] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "refresh_cache-c4697916-5d18-4d2b-9e12-91801de44580" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1923.277357] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "refresh_cache-c4697916-5d18-4d2b-9e12-91801de44580" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1923.277446] env[60788]: DEBUG nova.network.neutron [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1923.313746] env[60788]: DEBUG nova.network.neutron [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1923.579014] env[60788]: DEBUG nova.network.neutron [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Updating instance_info_cache with network_info: [{"id": "d4b2d3c8-5f64-45f0-a729-1f2e63601a5c", "address": "fa:16:3e:9b:f1:33", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd4b2d3c8-5f", "ovs_interfaceid": "d4b2d3c8-5f64-45f0-a729-1f2e63601a5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1923.591229] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "refresh_cache-c4697916-5d18-4d2b-9e12-91801de44580" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1923.591685] env[60788]: DEBUG nova.compute.manager [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Instance network_info: |[{"id": "d4b2d3c8-5f64-45f0-a729-1f2e63601a5c", "address": "fa:16:3e:9b:f1:33", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd4b2d3c8-5f", "ovs_interfaceid": "d4b2d3c8-5f64-45f0-a729-1f2e63601a5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 1923.592108] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:f1:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd4b2d3c8-5f64-45f0-a729-1f2e63601a5c', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1923.599796] env[60788]: DEBUG oslo.service.loopingcall [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1923.600345] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1923.600566] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b1c169ac-a78c-4568-bf3d-65ac0f0421e9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.622244] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1923.622244] env[60788]: value = "task-2205291" [ 1923.622244] env[60788]: _type = "Task" [ 1923.622244] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1923.630185] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205291, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1924.132300] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205291, 'name': CreateVM_Task, 'duration_secs': 0.304014} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1924.132509] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1924.133176] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1924.133322] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1924.133637] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1924.133879] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-35b1e550-440a-440a-8604-14491b45ebb2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.138093] env[60788]: DEBUG oslo_vmware.api [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 1924.138093] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5212e2f7-9f4f-2f17-c17f-9398e1aef7d1" [ 1924.138093] env[60788]: _type = "Task" [ 1924.138093] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1924.145358] env[60788]: DEBUG oslo_vmware.api [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5212e2f7-9f4f-2f17-c17f-9398e1aef7d1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1924.651144] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1924.651487] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1924.651775] env[60788]: DEBUG oslo_concurrency.lockutils [None req-87baf288-9e1c-49a3-ac7b-97998d71d271 tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1924.749554] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1925.244169] env[60788]: DEBUG nova.compute.manager [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Received event network-changed-d4b2d3c8-5f64-45f0-a729-1f2e63601a5c {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 1925.244385] env[60788]: DEBUG nova.compute.manager [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Refreshing instance network info cache due to event network-changed-d4b2d3c8-5f64-45f0-a729-1f2e63601a5c. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 1925.244598] env[60788]: DEBUG oslo_concurrency.lockutils [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] Acquiring lock "refresh_cache-c4697916-5d18-4d2b-9e12-91801de44580" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1925.244744] env[60788]: DEBUG oslo_concurrency.lockutils [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] Acquired lock "refresh_cache-c4697916-5d18-4d2b-9e12-91801de44580" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1925.244902] env[60788]: DEBUG nova.network.neutron [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Refreshing network info cache for port d4b2d3c8-5f64-45f0-a729-1f2e63601a5c {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1925.557729] env[60788]: DEBUG nova.network.neutron [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Updated VIF entry in instance network info cache for port d4b2d3c8-5f64-45f0-a729-1f2e63601a5c. {{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1925.558113] env[60788]: DEBUG nova.network.neutron [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Updating instance_info_cache with network_info: [{"id": "d4b2d3c8-5f64-45f0-a729-1f2e63601a5c", "address": "fa:16:3e:9b:f1:33", "network": {"id": "060fb5b2-be67-4add-b444-073bcf1af6f4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1392302374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d4606e74dad40acba2d78ea01a69919", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "33bcfd29-ad69-41ad-8e7f-55c1a3cf2dce", "external-id": "nsx-vlan-transportzone-725", "segmentation_id": 725, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd4b2d3c8-5f", "ovs_interfaceid": "d4b2d3c8-5f64-45f0-a729-1f2e63601a5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1925.567917] env[60788]: DEBUG oslo_concurrency.lockutils [req-5889ff02-8fff-495a-ba43-a7c176761df7 req-e1a3eecc-b759-4ae7-9b45-4e2a42513198 service nova] Releasing lock "refresh_cache-c4697916-5d18-4d2b-9e12-91801de44580" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1928.755070] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1929.753794] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1929.754048] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1929.754113] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1929.773641] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.773953] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.773953] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774153] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774332] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774461] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774583] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774705] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774823] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.774941] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1929.775077] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1929.775523] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1929.775689] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1931.754279] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1931.754601] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1931.754601] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1931.754759] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1931.766143] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1931.766354] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1931.766523] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1931.766675] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1931.768217] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8451b8a0-d689-47a0-985b-3f059559b288 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1931.776640] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76268c7b-856b-4448-a9f7-0aa3ed860bf6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1931.791623] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50e7403f-f99f-4fab-a698-a946732d12c8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1931.797722] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6582a9e-9ee7-4e70-a0bf-7a3c380e6f00 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1931.826708] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181197MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1931.826846] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1931.827040] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1931.901601] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 67c365fa-74b8-4a57-abbc-c143990a0292 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.901836] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.902036] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.902228] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.902414] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.902762] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.902762] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.902908] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.903074] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.903246] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1931.903509] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1931.903704] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1932.031648] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4e31474-bb28-4adf-9873-e7d7ecac5b79 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1932.039188] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0592fa07-069b-4869-9635-8794b1b97298 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1932.069679] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa0f288f-057a-4452-a9ea-6dbcedf5a736 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1932.076379] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50526711-5481-476b-b343-3b1c415e9d3b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1932.089345] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1932.097520] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1932.110858] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1932.111049] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1934.106543] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1936.753878] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1952.143090] env[60788]: WARNING oslo_vmware.rw_handles [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1952.143090] env[60788]: ERROR oslo_vmware.rw_handles [ 1952.143815] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1952.145493] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] 
[instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1952.145800] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Copying Virtual Disk [datastore2] vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/b7526a27-a409-46b8-ae87-26a6311dae0a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1952.145967] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0ed36d23-f1fd-45de-9e7d-ba34d5732a41 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.154356] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1952.154356] env[60788]: value = "task-2205292" [ 1952.154356] env[60788]: _type = "Task" [ 1952.154356] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1952.162165] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205292, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1952.664579] env[60788]: DEBUG oslo_vmware.exceptions [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1952.664823] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1952.665368] env[60788]: ERROR nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1952.665368] env[60788]: Faults: ['InvalidArgument'] [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Traceback (most recent call last): [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] yield resources [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self.driver.spawn(context, instance, image_meta, [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self._fetch_image_if_missing(context, vi) [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1952.665368] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] image_cache(vi, tmp_image_ds_loc) [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] vm_util.copy_virtual_disk( [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] session._wait_for_task(vmdk_copy_task) [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] return self.wait_for_task(task_ref) [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] return evt.wait() [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] result = hub.switch() [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] return self.greenlet.switch() [ 1952.666017] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1952.666605] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self.f(*self.args, **self.kw) [ 1952.666605] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1952.666605] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] raise exceptions.translate_fault(task_info.error) [ 1952.666605] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1952.666605] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Faults: ['InvalidArgument'] [ 1952.666605] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] [ 1952.666605] env[60788]: INFO nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Terminating instance [ 1952.667226] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1952.667432] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1952.667661] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bab9e2b2-348d-4e54-bdee-97f9f9882e8d {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.669713] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1952.669905] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1952.670629] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb7765b1-0de2-4d83-aba8-72356bc7bd62 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.677134] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1952.677352] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4e4aaeed-c8be-4c3f-b070-a2d12838d30c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.679378] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1952.679551] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1952.680480] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c80dab60-2bf0-44b3-a5f3-a9fbd36a1c42 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.685074] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){ [ 1952.685074] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52327c36-bd27-f339-967f-2efbd48ab16b" [ 1952.685074] env[60788]: _type = "Task" [ 1952.685074] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1952.692028] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52327c36-bd27-f339-967f-2efbd48ab16b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1952.741360] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1952.741835] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1952.741835] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleting the datastore file [datastore2] 67c365fa-74b8-4a57-abbc-c143990a0292 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1952.742018] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-75922620-7cf7-4cad-b4d4-3aff94f59380 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.747587] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 1952.747587] env[60788]: value = "task-2205294" [ 1952.747587] env[60788]: _type = "Task" [ 1952.747587] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1952.754970] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205294, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1953.197570] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1953.198070] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating directory with path [datastore2] vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1953.198070] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f6e0620-9cae-4d41-b794-60ba5e014759 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.210699] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Created directory with path [datastore2] vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1953.210902] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Fetch image to [datastore2] vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1953.211059] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1953.211813] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f04b5a16-d92c-42ec-8101-20e2ee182a98 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.218182] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89756dee-0281-473f-b434-9def5937d415 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.227189] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-995f422e-7774-43a9-935d-c7b5c35a930d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.261849] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc53efcf-25c1-4cce-9673-d9559002b860 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.269492] env[60788]: DEBUG oslo_vmware.api [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205294, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06714} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1953.271214] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1953.271349] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1953.271979] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1953.271979] env[60788]: INFO nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1953.273721] env[60788]: DEBUG nova.compute.claims [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1953.273886] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1953.274109] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1953.276601] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d2df7749-05a9-4d7a-a2f3-a3eb135eb7ec {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.298805] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1953.352358] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1953.413015] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1953.413240] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1953.495508] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e426d6c5-f66f-4328-b62f-ed4178c904b3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.503287] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b78b6b7a-5054-4021-9050-c273b6dff5c4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.534671] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba588f5e-96e0-496b-abf8-b70af2475a10 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.541830] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-285c2c43-a2ac-4bd2-aea9-fbc01c560310 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.554868] env[60788]: DEBUG nova.compute.provider_tree [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1953.564052] env[60788]: DEBUG nova.scheduler.client.report [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1953.577629] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1953.578188] env[60788]: ERROR nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1953.578188] env[60788]: Faults: ['InvalidArgument'] [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Traceback (most recent call last): [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] 
self.driver.spawn(context, instance, image_meta, [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self._fetch_image_if_missing(context, vi) [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] image_cache(vi, tmp_image_ds_loc) [ 1953.578188] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] vm_util.copy_virtual_disk( [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] session._wait_for_task(vmdk_copy_task) [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] return self.wait_for_task(task_ref) [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] return evt.wait() [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] result = hub.switch() [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] return self.greenlet.switch() [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1953.578610] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] self.f(*self.args, **self.kw) [ 1953.579009] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1953.579009] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] raise exceptions.translate_fault(task_info.error) [ 1953.579009] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1953.579009] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Faults: ['InvalidArgument'] [ 1953.579009] env[60788]: ERROR nova.compute.manager [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] [ 1953.579009] env[60788]: DEBUG nova.compute.utils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1953.580298] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Build of instance 67c365fa-74b8-4a57-abbc-c143990a0292 was re-scheduled: A specified parameter was not correct: fileType [ 1953.580298] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 1953.580673] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 1953.580847] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 1953.581032] env[60788]: DEBUG nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1953.581273] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1953.838678] env[60788]: DEBUG nova.network.neutron [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1953.849239] env[60788]: INFO nova.compute.manager [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Took 0.27 seconds to deallocate network for instance. 
[ 1953.949488] env[60788]: INFO nova.scheduler.client.report [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted allocations for instance 67c365fa-74b8-4a57-abbc-c143990a0292 [ 1953.971673] env[60788]: DEBUG oslo_concurrency.lockutils [None req-9150a31a-9856-4012-98d7-7627330ced3a tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "67c365fa-74b8-4a57-abbc-c143990a0292" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 628.114s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1953.971941] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "67c365fa-74b8-4a57-abbc-c143990a0292" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 431.931s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1953.972218] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "67c365fa-74b8-4a57-abbc-c143990a0292-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1953.972456] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "67c365fa-74b8-4a57-abbc-c143990a0292-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1953.972629] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "67c365fa-74b8-4a57-abbc-c143990a0292-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1953.974936] env[60788]: INFO nova.compute.manager [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Terminating instance [ 1953.976648] env[60788]: DEBUG nova.compute.manager [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Start destroying the instance on the hypervisor.
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 1953.976850] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1953.977334] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3d297e0c-a3e7-43ab-8adc-6c19f77edad9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.987028] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7fbf9da-f5be-4b43-b407-403ccdb6a6c0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.014621] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67c365fa-74b8-4a57-abbc-c143990a0292 could not be found. [ 1954.014818] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1954.014989] env[60788]: INFO nova.compute.manager [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1954.015240] env[60788]: DEBUG oslo.service.loopingcall [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1954.015456] env[60788]: DEBUG nova.compute.manager [-] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 1954.015553] env[60788]: DEBUG nova.network.neutron [-] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1954.038501] env[60788]: DEBUG nova.network.neutron [-] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1954.045747] env[60788]: INFO nova.compute.manager [-] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] Took 0.03 seconds to deallocate network for instance.
[ 1954.130696] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c295b4e7-cfcd-48cb-b197-75541f77518c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "67c365fa-74b8-4a57-abbc-c143990a0292" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.159s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1954.131687] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "67c365fa-74b8-4a57-abbc-c143990a0292" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 271.380s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1954.131891] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 67c365fa-74b8-4a57-abbc-c143990a0292] During sync_power_state the instance has a pending task (deleting). Skip. [ 1954.132093] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "67c365fa-74b8-4a57-abbc-c143990a0292" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1969.611646] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1984.753696] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1988.762218] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.754501] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1989.754716] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 1989.754786] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 1989.773724] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Skipping network cache update for instance because it is Building.
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774062] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774062] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774152] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774244] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774369] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774489] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774608] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774724] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 1989.774843] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 1989.775323] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1990.323246] env[60788]: DEBUG oslo_concurrency.lockutils [None req-ecdd7bf3-bce8-4bbf-8c33-979c3b1cfd26 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "6df14da6-6e82-4573-8dc3-27f8349e586f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1991.754313] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1991.754754] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1992.754104] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1992.765743] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1992.766036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1992.766245] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1992.766450] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1992.767575] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01a1bfac-7cfe-4e85-8bcd-d46b4c87e811 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.776361] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01779a18-c749-4cfe-b87b-464fb8e3f5da {{(pid=60788) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.789937] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cbbc6bc-8443-4aaa-9cd0-15ad1a1a3bed {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.795996] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d8a26f7-7de2-4411-83f7-68fa4655f780 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1992.825382] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181261MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1992.825519] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1992.825700] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1992.913320] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.913496] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.913626] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.913749] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.913870] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.913988] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.914126] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.914281] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.914419] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1992.914609] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1992.914746] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1992.930628] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing inventories for resource provider 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1992.942943] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating ProviderTree inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1992.943168] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1992.954044] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing aggregate associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, aggregates: None {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1992.970464] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing trait associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1993.067864] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a28f0db3-ca6d-45be-9871-14b509db0d41 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.075679] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5fc89be4-2948-4f9e-b731-fde5e3d84cf9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.104737] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe84be3-7df5-4841-b452-49d82ff7461f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.111282] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b093b8ab-24a5-4588-903f-c3be4d5f79e6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1993.125144] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1993.134506] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1993.147437] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1993.147613] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1993.753502] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1993.753678] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 1993.754065] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1993.754065] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances with incomplete migration {{(pid=60788) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}} [ 1994.758437] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1997.754770] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1999.754711] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1999.755101] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 1999.765792] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] There are 0 instances to clean {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 2003.212194] env[60788]: WARNING oslo_vmware.rw_handles [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2003.212194] env[60788]: ERROR oslo_vmware.rw_handles [ 2003.212947] env[60788]: DEBUG 
nova.virt.vmwareapi.images [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2003.214726] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2003.214966] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Copying Virtual Disk [datastore2] vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/f478b8f3-c6aa-4493-8ab9-dac074ce72e7/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2003.215263] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d1f38b7e-9ef9-4a2b-9d36-1457f9ce92f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.223538] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){ [ 2003.223538] env[60788]: value = "task-2205295" [ 2003.223538] env[60788]: _type = "Task" [ 2003.223538] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.231175] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': task-2205295, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2003.733655] env[60788]: DEBUG oslo_vmware.exceptions [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2003.733961] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2003.734509] env[60788]: ERROR nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2003.734509] env[60788]: Faults: ['InvalidArgument'] [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Traceback (most recent call last): [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] yield resources [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self.driver.spawn(context, instance, image_meta, [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self._fetch_image_if_missing(context, vi) [ 2003.734509] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] image_cache(vi, tmp_image_ds_loc) [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] vm_util.copy_virtual_disk( [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] session._wait_for_task(vmdk_copy_task) [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] return self.wait_for_task(task_ref) [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] return evt.wait() [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] result = hub.switch() [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2003.735153] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] return self.greenlet.switch() [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self.f(*self.args, **self.kw) [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] raise exceptions.translate_fault(task_info.error) [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Faults: ['InvalidArgument'] [ 2003.735724] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] [ 2003.735724] env[60788]: INFO nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Terminating instance [ 2003.737065] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2003.737065] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2003.737065] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6541e6e7-99b2-47f3-b159-e81ba5d8f4fb {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.738844] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2003.739032] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2003.739732] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12289dce-dc68-4c91-898f-c0068e8cb126 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.747165] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2003.748123] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-97d6a9d5-c4d2-4cd4-b0de-57422b9524ca {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.749457] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2003.749635] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2003.750376] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ae07f556-934a-4886-a747-f228955c1163 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.755395] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for the task: (returnval){ [ 2003.755395] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]524735ac-5d94-2adb-e337-6cc52b8117fb" [ 2003.755395] env[60788]: _type = "Task" [ 2003.755395] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.768353] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]524735ac-5d94-2adb-e337-6cc52b8117fb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2003.809096] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2003.809314] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2003.809493] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Deleting the datastore file [datastore2] e3671c90-83c7-48f3-8b2a-97f34ab2505e {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2003.809752] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c7bd0860-4874-4ba5-a703-ff4e3b2b94db {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.815415] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){ [ 2003.815415] env[60788]: value = "task-2205297" [ 2003.815415] env[60788]: _type = "Task" [ 2003.815415] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2003.822884] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': task-2205297, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2004.266444] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2004.266857] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating directory with path [datastore2] vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2004.266902] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1d66fca6-1f53-4968-8773-4d5497b96ca4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.277578] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Created directory with path [datastore2] vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2004.277762] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Fetch image to [datastore2] vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2004.277999] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2004.278729] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52e4ffd4-b811-4b63-959e-276456452c90 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.285125] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01cb3f3b-68f1-4d67-b485-4fbb311d9c89 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.294931] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-398ccb2b-3ae6-4a96-9c57-3ccc411ee0d8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.327663] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35f7a29a-536a-43e4-b6a6-a39ff57fb46e {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.334284] env[60788]: DEBUG oslo_vmware.api [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': task-2205297, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073634} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2004.335731] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2004.335927] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2004.336117] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2004.336299] env[60788]: INFO nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Took 0.60 seconds to destroy the instance on the hypervisor. 
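The spawn failure in this run ends with oslo.vmware translating the vCenter task error into a VimFaultException; the earlier "Fault InvalidArgument not matched" line means no more specific exception class is registered for that fault name, so the generic exception carries it in its fault_list. A minimal sketch of how a caller can wait on such a task and classify the fault, assuming a pre-built oslo_vmware.api.VMwareAPISession as `session` and a task moref as `task` (Nova's _cache_sparse_image wraps the same wait_for_task() call):

```python
# Sketch only: classify the InvalidArgument fault seen in this log.
from oslo_vmware import exceptions as vexc


def wait_and_classify(session, task):
    try:
        # Polls the vCenter task (the "Task: {'id': ...} progress is 0%."
        # lines above) and raises a translated fault on error.
        return session.wait_for_task(task)
    except vexc.VimFaultException as e:
        # fault_list carries the raw VIM fault names, e.g. ['InvalidArgument'].
        if 'InvalidArgument' in e.fault_list:
            raise RuntimeError(
                'vCenter rejected a request parameter (here: fileType): %s'
                % e)
        raise
```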
[ 2004.338035] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-87265009-9833-42e4-9115-04cd6b857143 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.339857] env[60788]: DEBUG nova.compute.claims [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2004.340043] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2004.340273] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2004.361981] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2004.411927] env[60788]: DEBUG oslo_vmware.rw_handles [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2004.470717] env[60788]: DEBUG oslo_vmware.rw_handles [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2004.470904] env[60788]: DEBUG oslo_vmware.rw_handles [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2004.539615] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d31b6d3d-4f2c-458c-a595-3469842413ae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.547017] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06f39b5b-8582-4830-ae80-612041fb4195 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.578138] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a6b278-900c-439d-bc38-3791e154cbb0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.584987] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f788aaea-a7b3-4354-bee2-12a0268d3633 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.597679] env[60788]: DEBUG nova.compute.provider_tree [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2004.605817] env[60788]: DEBUG nova.scheduler.client.report [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2004.618228] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.278s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2004.618732] env[60788]: ERROR nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2004.618732] env[60788]: Faults: ['InvalidArgument'] [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Traceback (most recent call last): [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2004.618732] env[60788]: ERROR 
nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self.driver.spawn(context, instance, image_meta, [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self._fetch_image_if_missing(context, vi) [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] image_cache(vi, tmp_image_ds_loc) [ 2004.618732] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] vm_util.copy_virtual_disk( [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] session._wait_for_task(vmdk_copy_task) [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] return self.wait_for_task(task_ref) [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] return evt.wait() [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] result = hub.switch() [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] return self.greenlet.switch() [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2004.619201] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] self.f(*self.args, **self.kw) [ 2004.619635] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2004.619635] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] raise exceptions.translate_fault(task_info.error) [ 2004.619635] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2004.619635] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Faults: ['InvalidArgument'] [ 2004.619635] env[60788]: ERROR nova.compute.manager [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] [ 2004.619635] env[60788]: DEBUG nova.compute.utils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2004.620683] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Build of instance e3671c90-83c7-48f3-8b2a-97f34ab2505e was re-scheduled: A specified parameter was not correct: fileType [ 2004.620683] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2004.621150] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2004.621228] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2004.621393] env[60788]: DEBUG nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2004.621607] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2005.287160] env[60788]: DEBUG nova.network.neutron [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2005.304100] env[60788]: INFO nova.compute.manager [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Took 0.68 seconds to deallocate network for instance. [ 2005.394345] env[60788]: INFO nova.scheduler.client.report [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Deleted allocations for instance e3671c90-83c7-48f3-8b2a-97f34ab2505e [ 2005.414107] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6b252064-e25c-49e1-a543-a86f82b3f3c7 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 625.203s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.414107] env[60788]: DEBUG oslo_concurrency.lockutils [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 429.172s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2005.414314] env[60788]: DEBUG oslo_concurrency.lockutils [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2005.414450] env[60788]: DEBUG oslo_concurrency.lockutils [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s
{{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2005.414568] env[60788]: DEBUG oslo_concurrency.lockutils [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.416700] env[60788]: INFO nova.compute.manager [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Terminating instance [ 2005.418346] env[60788]: DEBUG nova.compute.manager [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2005.418537] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2005.418995] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fb576235-3bb6-453a-b4fd-c6f47f0de6a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.428333] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f917d81-e898-496d-9a53-0a5bba540fbc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2005.454850] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e3671c90-83c7-48f3-8b2a-97f34ab2505e could not be found. [ 2005.455070] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2005.455251] env[60788]: INFO nova.compute.manager [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2005.455487] env[60788]: DEBUG oslo.service.loopingcall [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2005.455937] env[60788]: DEBUG nova.compute.manager [-] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2005.456052] env[60788]: DEBUG nova.network.neutron [-] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2005.477389] env[60788]: DEBUG nova.network.neutron [-] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2005.485231] env[60788]: INFO nova.compute.manager [-] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] Took 0.03 seconds to deallocate network for instance. [ 2005.568615] env[60788]: DEBUG oslo_concurrency.lockutils [None req-69820b6a-8ab5-4780-a626-8bab578ee876 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.154s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2005.569468] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 322.817s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2005.569660] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e3671c90-83c7-48f3-8b2a-97f34ab2505e] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2005.569833] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "e3671c90-83c7-48f3-8b2a-97f34ab2505e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2047.761858] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2048.753714] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2050.754163] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2050.754593] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2050.754593] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2050.772527] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.772700] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.772802] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.772923] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.773060] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building.
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.773188] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.773310] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.773482] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2050.773613] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2051.082973] env[60788]: WARNING oslo_vmware.rw_handles [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2051.082973] env[60788]: ERROR oslo_vmware.rw_handles [ 2051.083463] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2051.085483] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Caching 
image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2051.085720] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Copying Virtual Disk [datastore2] vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/018e13d6-8684-474d-8875-45b2bcb5880a/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2051.086009] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6eeb3be5-d97e-450b-954b-07c8251a2198 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.095497] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for the task: (returnval){ [ 2051.095497] env[60788]: value = "task-2205298" [ 2051.095497] env[60788]: _type = "Task" [ 2051.095497] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.103203] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': task-2205298, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2051.605902] env[60788]: DEBUG oslo_vmware.exceptions [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2051.606178] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2051.606720] env[60788]: ERROR nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2051.606720] env[60788]: Faults: ['InvalidArgument'] [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Traceback (most recent call last): [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] yield resources [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self.driver.spawn(context, instance, image_meta, [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self._fetch_image_if_missing(context, vi) [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2051.606720] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] image_cache(vi, tmp_image_ds_loc) [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] vm_util.copy_virtual_disk( [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] session._wait_for_task(vmdk_copy_task) [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] return self.wait_for_task(task_ref) [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] return evt.wait() [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] result = hub.switch() [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] return self.greenlet.switch() [ 2051.607200] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2051.607659] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self.f(*self.args, **self.kw) [ 2051.607659] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2051.607659] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] raise exceptions.translate_fault(task_info.error) [ 2051.607659] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2051.607659] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Faults: ['InvalidArgument'] [ 2051.607659] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] [ 2051.607659] env[60788]: INFO nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Terminating instance [ 2051.608526] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2051.608734] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2051.608965] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba5ac730-022c-429d-bfad-d806cc459c36 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.611088] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2051.611280] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2051.612016] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a592d2f-1350-49dd-a5c1-4aba486315b8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.618550] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2051.618773] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a65ca11b-c618-43e7-85c3-25afb7e3852b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.620751] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2051.620920] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2051.621859] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cdd2e016-8fb4-4872-9fe6-6b97279dda8c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.626525] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Waiting for the task: (returnval){ [ 2051.626525] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52d1f3b9-a221-afe9-dd80-20a0816c25a0" [ 2051.626525] env[60788]: _type = "Task" [ 2051.626525] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.633776] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52d1f3b9-a221-afe9-dd80-20a0816c25a0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2051.682467] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2051.682683] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2051.682896] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Deleting the datastore file [datastore2] e34c6299-ae90-4e5a-b272-3623dfe876c0 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2051.683174] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-585781c1-3113-4599-a572-09cb32bf47bc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.689422] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for the task: (returnval){ [ 2051.689422] env[60788]: value = "task-2205300" [ 2051.689422] env[60788]: _type = "Task" [ 2051.689422] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.696662] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': task-2205300, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2051.753293] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2051.753584] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2052.137218] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2052.137567] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Creating directory with path [datastore2] vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2052.137701] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a5ee75c-a16e-4fba-8150-22d81bb34bfa {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.149037] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Created directory with path [datastore2] vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2052.149239] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Fetch image to [datastore2] vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2052.149409] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2052.150145] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dc395d3-ec14-4fa0-b869-f7ae7016ba46 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.157940] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bf2261f3-1c06-41f7-a3f2-8d2a76bc6177 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.166817] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ce02927-3f80-466c-8096-51769362b4ea {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.199489] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-275e6787-9e5a-4ae0-8365-60445518f3fc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.206462] env[60788]: DEBUG oslo_vmware.api [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': task-2205300, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079365} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2052.207926] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2052.208133] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2052.208306] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2052.208475] env[60788]: INFO nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2052.210256] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b684d87c-f34b-4a8e-9286-b585924ea673 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.212461] env[60788]: DEBUG nova.compute.claims [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2052.212461] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2052.212607] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2052.236301] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2052.298200] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2052.361596] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2052.361803] env[60788]: DEBUG oslo_vmware.rw_handles [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2052.413874] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a79809a3-d5f9-437f-beb8-547b34e44644 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.421558] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fca13dbc-29fc-43d6-834c-8059e373700d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.452088] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-519ccd2d-b472-40cf-8775-a08acbe62a4a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.458979] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46255fba-9548-414b-a749-22cf3d7f8f45 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.471903] env[60788]: DEBUG nova.compute.provider_tree [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2052.480303] env[60788]: DEBUG nova.scheduler.client.report [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2052.493984] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2052.494556] env[60788]: ERROR nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2052.494556] env[60788]: Faults: ['InvalidArgument'] [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Traceback (most recent call last): [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: 
e34c6299-ae90-4e5a-b272-3623dfe876c0] self.driver.spawn(context, instance, image_meta, [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self._fetch_image_if_missing(context, vi) [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] image_cache(vi, tmp_image_ds_loc) [ 2052.494556] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] vm_util.copy_virtual_disk( [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] session._wait_for_task(vmdk_copy_task) [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] return self.wait_for_task(task_ref) [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] return evt.wait() [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] result = hub.switch() [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] return self.greenlet.switch() [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2052.494982] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] self.f(*self.args, **self.kw) [ 2052.495393] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2052.495393] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] raise exceptions.translate_fault(task_info.error) [ 2052.495393] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2052.495393] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Faults: ['InvalidArgument'] [ 2052.495393] env[60788]: ERROR nova.compute.manager [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] [ 2052.495393] env[60788]: DEBUG nova.compute.utils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2052.496599] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Build of instance e34c6299-ae90-4e5a-b272-3623dfe876c0 was re-scheduled: A specified parameter was not correct: fileType [ 2052.496599] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2052.496965] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2052.497159] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2052.497329] env[60788]: DEBUG nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2052.497514] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2052.753963] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2052.770351] env[60788]: DEBUG nova.network.neutron [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2052.781451] env[60788]: INFO nova.compute.manager [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Took 0.28 seconds to deallocate network for instance. [ 2052.884885] env[60788]: INFO nova.scheduler.client.report [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Deleted allocations for instance e34c6299-ae90-4e5a-b272-3623dfe876c0 [ 2052.905454] env[60788]: DEBUG oslo_concurrency.lockutils [None req-01a536e8-4d25-4fc2-887e-b91a22ca8928 tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 652.184s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2052.905743] env[60788]: DEBUG oslo_concurrency.lockutils [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 455.630s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2052.905965] env[60788]: DEBUG oslo_concurrency.lockutils [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquiring lock "e34c6299-ae90-4e5a-b272-3623dfe876c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2052.906184] env[60788]: DEBUG oslo_concurrency.lockutils [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798
tempest-ServersTestJSON-960083798-project-member] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2052.906356] env[60788]: DEBUG oslo_concurrency.lockutils [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2052.908618] env[60788]: INFO nova.compute.manager [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Terminating instance [ 2052.910314] env[60788]: DEBUG nova.compute.manager [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2052.910511] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2052.911008] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4a98cdfc-9ceb-4865-9ee6-c3da97ef617c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.920164] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c98b993c-fcbe-4438-b04a-c80d01449bd3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.946984] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e34c6299-ae90-4e5a-b272-3623dfe876c0 could not be found. [ 2052.947207] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2052.947391] env[60788]: INFO nova.compute.manager [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2052.947665] env[60788]: DEBUG oslo.service.loopingcall [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2052.948141] env[60788]: DEBUG nova.compute.manager [-] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2052.948244] env[60788]: DEBUG nova.network.neutron [-] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2052.971736] env[60788]: DEBUG nova.network.neutron [-] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2052.979400] env[60788]: INFO nova.compute.manager [-] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] Took 0.03 seconds to deallocate network for instance. [ 2053.088325] env[60788]: DEBUG oslo_concurrency.lockutils [None req-47c9f36d-fc49-4955-8869-059412296e8c tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.182s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.089774] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 370.337s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.090044] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: e34c6299-ae90-4e5a-b272-3623dfe876c0] During sync_power_state the instance has a pending task (deleting). Skip. [ 2053.090280] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "e34c6299-ae90-4e5a-b272-3623dfe876c0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.753309] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2053.753707] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2053.753707] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2053.766135] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2053.766352] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.766524] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.766682] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2053.767798] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdc83bb5-568c-4df3-bd22-5c4d0571da4c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.776745] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edaee40c-e2bc-43ca-aa12-46952dc8f427 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.790696] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9009ecf-2be8-4639-9ffc-6d3258425480 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.797301] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de3af866-bf0e-4484-a7d4-dc4854cf2e62 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.826150] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181256MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2053.826302] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2053.826492] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.884876] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885054] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885189] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885313] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885433] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885552] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885669] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2053.885850] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2053.885987] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2053.970998] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e5b58ac-4d04-4438-b273-73dc65dbd580 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.978879] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aa990df-3e4f-429e-8ecb-7f224638ccf0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2054.009320] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2fc2f41-b0f4-4371-82b3-4c5a3277e1e1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2054.016452] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c74c719-59b7-48cf-91c7-5b4bb1da2aac {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2054.029222] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2054.037522] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2054.052531] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2054.052715] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2056.048800] env[60788]: DEBUG oslo_service.periodic_task [None 
req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2058.753712] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2062.343701] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "5aad5755-1a12-45a1-b30c-9a407992ad62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2062.344041] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "5aad5755-1a12-45a1-b30c-9a407992ad62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2062.354208] env[60788]: DEBUG nova.compute.manager [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2062.401923] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2062.402200] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2062.403734] env[60788]: INFO nova.compute.claims [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2062.548596] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05fe70bc-debe-4e83-a7c1-47fe34f90eff {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2062.556489] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba42a6e9-76cf-441c-b811-aa18c49ce17e {{(pid=60788) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2062.587971] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a578fe1-b91d-4941-a782-44500cfb1cd5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2062.595349] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e161f3d-5a6f-4d61-a7dc-abfc9ab53a5f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2062.609642] env[60788]: DEBUG nova.compute.provider_tree [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2062.618540] env[60788]: DEBUG nova.scheduler.client.report [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2062.633024] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2062.633024] env[60788]: DEBUG nova.compute.manager [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2062.666058] env[60788]: DEBUG nova.compute.utils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2062.667907] env[60788]: DEBUG nova.compute.manager [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Allocating IP information in the background. 
{{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2062.668126] env[60788]: DEBUG nova.network.neutron [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2062.676389] env[60788]: DEBUG nova.compute.manager [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2062.726372] env[60788]: DEBUG nova.policy [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34d238f3928b4f40813646c9867375c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a80b1c30e829410c9a324f5a4af8c9f7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 2062.739377] env[60788]: DEBUG nova.compute.manager [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2062.764382] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=<?>,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-02-03T12:04:37Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2062.764641] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2062.764799] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2062.764982] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2062.765143] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2062.765291] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2062.765494] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2062.765666] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2062.765856]
env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2062.766034] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2062.766212] env[60788]: DEBUG nova.virt.hardware [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2062.767322] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-800ad41b-f8ea-4daf-93ca-9ecb7e7e0be0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2062.774917] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae932cba-d39c-4973-986f-c8a30267a155 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.050892] env[60788]: DEBUG nova.network.neutron [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Successfully created port: ff7529e3-fbeb-48cc-99fe-92c4368b0565 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2063.694316] env[60788]: DEBUG nova.compute.manager [req-38505d76-34c3-4087-8b87-f36b7ddee86c req-852984fa-46b6-4d92-bd39-29d5359c6cbf service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Received event network-vif-plugged-ff7529e3-fbeb-48cc-99fe-92c4368b0565 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2063.694637] env[60788]: DEBUG oslo_concurrency.lockutils [req-38505d76-34c3-4087-8b87-f36b7ddee86c req-852984fa-46b6-4d92-bd39-29d5359c6cbf service nova] Acquiring lock "5aad5755-1a12-45a1-b30c-9a407992ad62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2063.694892] env[60788]: DEBUG oslo_concurrency.lockutils [req-38505d76-34c3-4087-8b87-f36b7ddee86c req-852984fa-46b6-4d92-bd39-29d5359c6cbf service nova] Lock "5aad5755-1a12-45a1-b30c-9a407992ad62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2063.695121] env[60788]: DEBUG oslo_concurrency.lockutils [req-38505d76-34c3-4087-8b87-f36b7ddee86c req-852984fa-46b6-4d92-bd39-29d5359c6cbf service nova] Lock "5aad5755-1a12-45a1-b30c-9a407992ad62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [
2063.695331] env[60788]: DEBUG nova.compute.manager [req-38505d76-34c3-4087-8b87-f36b7ddee86c req-852984fa-46b6-4d92-bd39-29d5359c6cbf service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] No waiting events found dispatching network-vif-plugged-ff7529e3-fbeb-48cc-99fe-92c4368b0565 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2063.695576] env[60788]: WARNING nova.compute.manager [req-38505d76-34c3-4087-8b87-f36b7ddee86c req-852984fa-46b6-4d92-bd39-29d5359c6cbf service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Received unexpected event network-vif-plugged-ff7529e3-fbeb-48cc-99fe-92c4368b0565 for instance with vm_state building and task_state spawning. [ 2063.829156] env[60788]: DEBUG nova.network.neutron [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Successfully updated port: ff7529e3-fbeb-48cc-99fe-92c4368b0565 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2063.843297] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "refresh_cache-5aad5755-1a12-45a1-b30c-9a407992ad62" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2063.843614] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "refresh_cache-5aad5755-1a12-45a1-b30c-9a407992ad62" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2063.843699] env[60788]: DEBUG nova.network.neutron [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2063.890507] env[60788]: DEBUG nova.network.neutron [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2064.050333] env[60788]: DEBUG nova.network.neutron [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Updating instance_info_cache with network_info: [{"id": "ff7529e3-fbeb-48cc-99fe-92c4368b0565", "address": "fa:16:3e:12:bf:70", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff7529e3-fb", "ovs_interfaceid": "ff7529e3-fbeb-48cc-99fe-92c4368b0565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2064.062211] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "refresh_cache-5aad5755-1a12-45a1-b30c-9a407992ad62" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2064.062492] env[60788]: DEBUG nova.compute.manager [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Instance network_info: |[{"id": "ff7529e3-fbeb-48cc-99fe-92c4368b0565", "address": "fa:16:3e:12:bf:70", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff7529e3-fb", "ovs_interfaceid": "ff7529e3-fbeb-48cc-99fe-92c4368b0565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 2064.062864] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:12:bf:70', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a4b6ddb2-2e19-4031-9b22-add90d41a114', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ff7529e3-fbeb-48cc-99fe-92c4368b0565', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2064.070757] env[60788]: DEBUG oslo.service.loopingcall [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2064.071188] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2064.071507] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-019bc026-7116-44b3-aaee-8caa4d9a5e36 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2064.092623] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2064.092623] env[60788]: value = "task-2205301" [ 2064.092623] env[60788]: _type = "Task" [ 2064.092623] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2064.100238] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205301, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2064.602983] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205301, 'name': CreateVM_Task, 'duration_secs': 0.276431} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2064.603161] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2064.603784] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2064.603949] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2064.604304] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2064.604585] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1a66d51a-f918-4dab-9589-b2348f67e700 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2064.608647] env[60788]: DEBUG oslo_vmware.api [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 2064.608647] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5208466c-c4d9-1220-9a22-383106cfcac5" [ 2064.608647] env[60788]: _type = "Task" [ 2064.608647] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2064.615753] env[60788]: DEBUG oslo_vmware.api [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5208466c-c4d9-1220-9a22-383106cfcac5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2065.119535] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2065.119921] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2065.119921] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f955a4cc-fc52-4f98-bca0-86b306a62901 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2065.720890] env[60788]: DEBUG nova.compute.manager [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Received event network-changed-ff7529e3-fbeb-48cc-99fe-92c4368b0565 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2065.721055] env[60788]: DEBUG nova.compute.manager [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Refreshing instance network info cache due to event network-changed-ff7529e3-fbeb-48cc-99fe-92c4368b0565. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2065.721274] env[60788]: DEBUG oslo_concurrency.lockutils [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] Acquiring lock "refresh_cache-5aad5755-1a12-45a1-b30c-9a407992ad62" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2065.721445] env[60788]: DEBUG oslo_concurrency.lockutils [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] Acquired lock "refresh_cache-5aad5755-1a12-45a1-b30c-9a407992ad62" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2065.721612] env[60788]: DEBUG nova.network.neutron [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Refreshing network info cache for port ff7529e3-fbeb-48cc-99fe-92c4368b0565 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2066.052080] env[60788]: DEBUG nova.network.neutron [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Updated VIF entry in instance network info cache for port ff7529e3-fbeb-48cc-99fe-92c4368b0565. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2066.052639] env[60788]: DEBUG nova.network.neutron [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Updating instance_info_cache with network_info: [{"id": "ff7529e3-fbeb-48cc-99fe-92c4368b0565", "address": "fa:16:3e:12:bf:70", "network": {"id": "198a3e9b-91bd-4eaf-9da7-a93a2a4d194d", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-305850880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a80b1c30e829410c9a324f5a4af8c9f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4b6ddb2-2e19-4031-9b22-add90d41a114", "external-id": "nsx-vlan-transportzone-921", "segmentation_id": 921, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapff7529e3-fb", "ovs_interfaceid": "ff7529e3-fbeb-48cc-99fe-92c4368b0565", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2066.067643] env[60788]: DEBUG oslo_concurrency.lockutils [req-7f9ecc39-d1ca-4ed9-a5aa-7a6580b03ee9 req-26d10f64-083d-4d1c-9282-01f8e8d70bc6 service nova] Releasing lock "refresh_cache-5aad5755-1a12-45a1-b30c-9a407992ad62" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2101.100419] env[60788]: WARNING oslo_vmware.rw_handles [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2101.100419] env[60788]: ERROR oslo_vmware.rw_handles [ 2101.101295] env[60788]: DEBUG nova.virt.vmwareapi.images [None 
req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2101.102969] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2101.103288] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Copying Virtual Disk [datastore2] vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/c8dd58e2-6fae-4286-9781-313131ecfaea/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2101.103600] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a34c3c70-687f-4436-a9e4-4babdc1d950a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.111958] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Waiting for the task: (returnval){ [ 2101.111958] env[60788]: value = "task-2205302" [ 2101.111958] env[60788]: _type = "Task" [ 2101.111958] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2101.119555] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Task: {'id': task-2205302, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2101.622343] env[60788]: DEBUG oslo_vmware.exceptions [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2101.622640] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2101.623241] env[60788]: ERROR nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2101.623241] env[60788]: Faults: ['InvalidArgument'] [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Traceback (most recent call last): [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] yield resources [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self.driver.spawn(context, instance, image_meta, [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self._fetch_image_if_missing(context, vi) [ 2101.623241] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] image_cache(vi, tmp_image_ds_loc) [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] vm_util.copy_virtual_disk( [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] session._wait_for_task(vmdk_copy_task) [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] return self.wait_for_task(task_ref) [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] return evt.wait() [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] result = hub.switch() [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2101.623717] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] return self.greenlet.switch() [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self.f(*self.args, **self.kw) [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] raise exceptions.translate_fault(task_info.error) [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Faults: ['InvalidArgument'] [ 2101.624134] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] [ 2101.624134] env[60788]: INFO nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Terminating instance [ 2101.625139] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2101.625351] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2101.625594] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-15b5c494-525b-4323-8d93-f6cf39b331da {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.629179] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2101.629376] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2101.630091] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f85df483-edc1-4dcb-bfc8-fb44319b0625 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.636491] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2101.636703] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f8b02b8-f20f-4bf1-b487-e42e7bd41d65 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.638852] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2101.639035] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2101.639973] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d98c04f9-16d6-4213-909e-f6bb897ab8c5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.644996] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 2101.644996] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]523a04db-1f13-a7b2-70b8-cdda237b87a0" [ 2101.644996] env[60788]: _type = "Task" [ 2101.644996] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2101.653672] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]523a04db-1f13-a7b2-70b8-cdda237b87a0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2101.703538] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2101.703744] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2101.703919] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Deleting the datastore file [datastore2] ded19ccc-a92f-4d3e-8659-593a1aab1651 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2101.704203] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b2e54f94-3410-434f-a185-e1ddc1a875c2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.709863] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Waiting for the task: (returnval){ [ 2101.709863] env[60788]: value = "task-2205304" [ 2101.709863] env[60788]: _type = "Task" [ 2101.709863] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2101.717553] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Task: {'id': task-2205304, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2102.155623] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2102.155976] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating directory with path [datastore2] vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2102.156123] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-01bc0e7e-2cec-443a-ba84-4618b8479169 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.167487] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Created directory with path [datastore2] vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2102.167674] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Fetch image to [datastore2] vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2102.167907] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2102.168652] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90ba9f46-fce3-4964-a12c-e9c79c7844e8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.174972] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dba43540-41ca-4475-9586-eadfc4040269 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.183840] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d46d58c8-fcbf-4498-b941-3f0ffa71941f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.216619] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-69eccbf4-0c1f-400f-b5a8-4a2550b5241d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.223272] env[60788]: DEBUG oslo_vmware.api [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Task: {'id': task-2205304, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.086991} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2102.224689] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2102.224883] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2102.225110] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2102.225296] env[60788]: INFO nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Took 0.60 seconds to destroy the instance on the hypervisor. 
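[editor's note] The wait_for_task / _poll_task pairs above (api.py:397 and api.py:434-448) follow a plain poll-until-terminal pattern: log "progress is N%" while the vCenter task runs, return on success ("completed successfully ... duration_secs"), and raise a translated fault on error. A minimal Python sketch of that pattern, assuming hypothetical get_task_info and translate_fault callables standing in for the oslo.vmware internals named in the log; this is not the library source.

import time

POLL_INTERVAL = 0.5  # assumed poll period; the real interval is configurable

def wait_for_task(task_ref, get_task_info, translate_fault):
    """Poll a vCenter task until it reaches a terminal state.

    get_task_info(task_ref) is assumed to return an object with .state
    ('queued' | 'running' | 'success' | 'error'), .result and .error.
    """
    while True:
        info = get_task_info(task_ref)   # e.g. Task: {'id': task-2205304, ...}
        if info.state == 'success':
            return info.result           # "completed successfully" in the log
        if info.state == 'error':
            # Mirrors the traceback below: raise translate_fault(task_info.error)
            raise translate_fault(info.error)
        # Task still queued/running; the log prints "progress is N%" here.
        time.sleep(POLL_INTERVAL)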
[ 2102.227310] env[60788]: DEBUG nova.compute.claims [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2102.227483] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2102.227694] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2102.230132] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f437a4b1-8e5d-4067-82e2-a7f197dd7ec0 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.251882] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2102.305263] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2102.365144] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2102.365338] env[60788]: DEBUG oslo_vmware.rw_handles [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2102.452937] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c168aa61-6e6c-4b27-910f-e014c8a6512c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.461201] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfd42e5f-e27b-4f12-a228-c01518fbc420 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.490997] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89dbb840-eeed-4b25-8e13-ae8b11eaf091 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.498061] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d44c287-20ed-4d06-a746-400f0f148cae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.510513] env[60788]: DEBUG nova.compute.provider_tree [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2102.521032] env[60788]: DEBUG nova.scheduler.client.report [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2102.533867] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.306s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.534021] env[60788]: ERROR nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2102.534021] env[60788]: Faults: ['InvalidArgument'] [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Traceback (most recent call last): [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2102.534021] env[60788]: 
ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self.driver.spawn(context, instance, image_meta, [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self._fetch_image_if_missing(context, vi) [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] image_cache(vi, tmp_image_ds_loc) [ 2102.534021] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] vm_util.copy_virtual_disk( [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] session._wait_for_task(vmdk_copy_task) [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] return self.wait_for_task(task_ref) [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] return evt.wait() [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] result = hub.switch() [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] return self.greenlet.switch() [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2102.534474] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] self.f(*self.args, **self.kw) [ 2102.534926] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2102.534926] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] raise exceptions.translate_fault(task_info.error) [ 2102.534926] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2102.534926] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Faults: ['InvalidArgument'] [ 2102.534926] env[60788]: ERROR nova.compute.manager [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] [ 2102.534926] env[60788]: DEBUG nova.compute.utils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2102.536471] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Build of instance ded19ccc-a92f-4d3e-8659-593a1aab1651 was re-scheduled: A specified parameter was not correct: fileType [ 2102.536471] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2102.536857] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2102.537041] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2102.537215] env[60788]: DEBUG nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2102.537375] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2103.106155] env[60788]: DEBUG nova.network.neutron [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2103.117887] env[60788]: INFO nova.compute.manager [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Took 0.58 seconds to deallocate network for instance. [ 2103.224078] env[60788]: INFO nova.scheduler.client.report [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Deleted allocations for instance ded19ccc-a92f-4d3e-8659-593a1aab1651 [ 2103.243701] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6fb28663-0def-4b80-bc38-7da1f0dcbcc1 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 580.460s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2103.243955] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 420.492s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2103.244173] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] During sync_power_state the instance has a pending task (spawning). Skip. 
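[editor's note] The oslo_concurrency.lockutils entries above and below record two durations per lock: how long the caller waited to acquire ("acquired ... :: waited 420.492s") and how long it held the lock ("'released' ... :: held 580.460s"). A minimal sketch of that accounting as a context manager; the message format is copied from the log, the implementation is an assumption, not the lockutils source.

import contextlib
import logging
import threading
import time

LOG = logging.getLogger('oslo_concurrency.lockutils')

@contextlib.contextmanager
def timed_lock(lock: threading.Lock, name: str, caller: str):
    start = time.monotonic()
    lock.acquire()
    acquired = time.monotonic()
    LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs',
              name, caller, acquired - start)
    try:
        yield
    finally:
        lock.release()
        # The quoted "released" below matches the log's own formatting.
        LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                  name, caller, time.monotonic() - acquired)

Usage would mirror the entries above, e.g. wrapping the instance UUID lock around do_terminate_instance so both the 384.700s wait and the eventual hold time are logged.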
[ 2103.244356] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2103.244572] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 384.700s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2103.244806] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Acquiring lock "ded19ccc-a92f-4d3e-8659-593a1aab1651-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2103.244977] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2103.245158] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2103.247010] env[60788]: INFO nova.compute.manager [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Terminating instance [ 2103.249778] env[60788]: DEBUG nova.compute.manager [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Start destroying the instance on the hypervisor. 
{{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2103.249973] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2103.250245] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-34fbccdd-2ca1-4fa6-b39d-912e7f66a0c7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.261548] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13aeb19c-67c0-444b-99cd-a9f47223d7e5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.289043] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ded19ccc-a92f-4d3e-8659-593a1aab1651 could not be found. [ 2103.289201] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2103.289382] env[60788]: INFO nova.compute.manager [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2103.289618] env[60788]: DEBUG oslo.service.loopingcall [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2103.289840] env[60788]: DEBUG nova.compute.manager [-] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2103.289938] env[60788]: DEBUG nova.network.neutron [-] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2103.312238] env[60788]: DEBUG nova.network.neutron [-] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2103.320869] env[60788]: INFO nova.compute.manager [-] [instance: ded19ccc-a92f-4d3e-8659-593a1aab1651] Took 0.03 seconds to deallocate network for instance. 
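[editor's note] The destroy path just above tolerates a VM that is already gone: the WARNING "Instance does not exist on backend: nova.exception.InstanceNotFound" is immediately followed by "Instance destroyed" and network deallocation proceeds anyway. A minimal sketch of that idempotent-cleanup shape, with hypothetical find_vm_ref and unregister_and_delete helpers; not the nova.virt.vmwareapi source.

class InstanceNotFound(Exception):
    pass

def destroy(instance_uuid, find_vm_ref, unregister_and_delete, log):
    try:
        vm_ref = find_vm_ref(instance_uuid)   # SearchIndex.FindAllByUuid in the log
        unregister_and_delete(vm_ref)         # unregister VM, delete datastore files
    except InstanceNotFound as exc:
        # Nothing left on the backend; cleanup must continue regardless.
        log.warning('Instance does not exist on backend: %s', exc)
    log.debug('Instance destroyed')           # emitted on both paths above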
[ 2103.422924] env[60788]: DEBUG oslo_concurrency.lockutils [None req-5208995c-0785-4aa8-824c-ae5020e34677 tempest-ServerPasswordTestJSON-1970156309 tempest-ServerPasswordTestJSON-1970156309-project-member] Lock "ded19ccc-a92f-4d3e-8659-593a1aab1651" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.178s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2110.753416] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2112.754156] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2112.754519] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2112.754519] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2112.776251] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.776742] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.776957] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.777114] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.777250] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.777506] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.777583] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2112.777705] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2112.778368] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2113.753980] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2113.754184] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2113.754452] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2113.765961] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2113.766185] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2113.766361] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2113.766518] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2113.767628] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3678e25-0482-4901-b526-c662380595ed {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.776673] env[60788]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66d8e3d0-4d4d-4fc0-91e5-ed87315adba7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.790334] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51986165-694b-43b8-8a2d-0bf6caf25b86 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.796555] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2497a1e5-b1e8-40ce-bc13-36e20a9d4de6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.826531] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181269MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2113.826683] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2113.826853] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2113.898506] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.898771] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.898959] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.899155] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.899318] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.899483] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.899644] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5aad5755-1a12-45a1-b30c-9a407992ad62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2113.899889] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2113.900078] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2113.998346] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4368a41-83b0-4780-a829-ef2830ac7c6d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2114.006144] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-368e853c-cb93-4987-b3b2-e964db738bcf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2114.035320] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-134561f8-9b57-4857-a863-1d0835889092 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2114.042266] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be109917-442f-41c1-bcb5-6276292d1e30 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2114.057626] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2114.065826] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 
75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2114.079381] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2114.079565] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2115.080675] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2115.080921] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2117.539137] env[60788]: DEBUG oslo_concurrency.lockutils [None req-beb82b31-491f-4854-b110-ad0cd395b58e tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "c4697916-5d18-4d2b-9e12-91801de44580" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2117.749473] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2119.196201] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "432ef65e-4072-44d3-81c5-9371aacbb1c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.196492] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Lock "432ef65e-4072-44d3-81c5-9371aacbb1c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.209582] env[60788]: DEBUG nova.compute.manager [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 
tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Starting instance... {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2119.261655] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.261907] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.263417] env[60788]: INFO nova.compute.claims [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2119.418254] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813e9047-20d8-46cb-bbd8-9d4ce9e2cefe {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.427112] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37840d74-5272-40b7-93d4-56a94f48fe40 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.459287] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-623e3ed1-c046-4ab1-a91d-5ff3b4db974e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.466888] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd986d8c-0a1a-46ea-ac8f-e6d2d01587f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.480327] env[60788]: DEBUG nova.compute.provider_tree [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2119.490171] env[60788]: DEBUG nova.scheduler.client.report [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2119.504650] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2119.505156] env[60788]: DEBUG nova.compute.manager [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2119.536824] env[60788]: DEBUG nova.compute.utils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2119.539502] env[60788]: DEBUG nova.compute.manager [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2119.539502] env[60788]: DEBUG nova.network.neutron [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2119.547647] env[60788]: DEBUG nova.compute.manager [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2119.624149] env[60788]: DEBUG nova.compute.manager [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2119.676284] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2119.676546] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2119.676697] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2119.676889] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2119.677033] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2119.677190] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2119.677413] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2119.677576] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2119.677742] 
env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2119.677900] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2119.678090] env[60788]: DEBUG nova.virt.hardware [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2119.678962] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8edffce6-a0f4-4534-b021-9ebc63c54cdd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.687364] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-909dafa4-f228-4f74-a948-62f0d200e424 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.706243] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "66228017-46bd-4709-8771-6a1947f8a643" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.706587] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Lock "66228017-46bd-4709-8771-6a1947f8a643" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.717769] env[60788]: DEBUG nova.compute.manager [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2119.737105] env[60788]: DEBUG nova.policy [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0832ae79e7d14a68a2242350d8c1979b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '36971c7f0bbf4e0ea014e36acbbcbdd0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 2119.780989] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.781294] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.782702] env[60788]: INFO nova.compute.claims [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2119.963150] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1265699f-6f11-45c3-8b28-59e718b2a76f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.971076] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18cdbace-79ef-4281-b057-570150c044f9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.000414] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92d4c3fb-505e-4511-b78b-2ff7775042f4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.007407] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bcaa572-3886-4e03-a516-355976a87a4f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.020098] env[60788]: DEBUG nova.compute.provider_tree [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2120.031667] env[60788]: DEBUG nova.scheduler.client.report 
[None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2120.044963] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2120.045429] env[60788]: DEBUG nova.compute.manager [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2120.086742] env[60788]: DEBUG nova.compute.utils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2120.087924] env[60788]: DEBUG nova.compute.manager [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2120.088108] env[60788]: DEBUG nova.network.neutron [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2120.096679] env[60788]: DEBUG nova.compute.manager [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2120.162289] env[60788]: DEBUG nova.compute.manager [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2120.188344] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2120.188593] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2120.188750] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2120.188931] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2120.189103] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2120.189258] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2120.189470] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2120.189631] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2120.189859] 
env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2120.190084] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2120.190271] env[60788]: DEBUG nova.virt.hardware [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2120.191133] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-673eddc8-9d47-448e-b8b8-62792ccb5f59 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.196567] env[60788]: DEBUG nova.policy [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0832ae79e7d14a68a2242350d8c1979b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '36971c7f0bbf4e0ea014e36acbbcbdd0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 2120.204025] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-356274ad-599f-436c-9c69-f48cc03805cf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.272223] env[60788]: DEBUG nova.network.neutron [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Successfully created port: bd6c52ae-daea-432c-838a-650b6116eb41 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2120.756583] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2120.814276] env[60788]: DEBUG nova.network.neutron [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Successfully created port: cc97a428-56a0-4567-afb8-c4a03651dfe5 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2121.322826] env[60788]: DEBUG nova.compute.manager [req-731c7afd-ba2f-4397-b093-50da59e6ac21 req-92bfbfc7-ad65-4b3b-97d2-f41cf42268dc 
service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Received event network-vif-plugged-bd6c52ae-daea-432c-838a-650b6116eb41 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2121.323130] env[60788]: DEBUG oslo_concurrency.lockutils [req-731c7afd-ba2f-4397-b093-50da59e6ac21 req-92bfbfc7-ad65-4b3b-97d2-f41cf42268dc service nova] Acquiring lock "432ef65e-4072-44d3-81c5-9371aacbb1c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2121.323327] env[60788]: DEBUG oslo_concurrency.lockutils [req-731c7afd-ba2f-4397-b093-50da59e6ac21 req-92bfbfc7-ad65-4b3b-97d2-f41cf42268dc service nova] Lock "432ef65e-4072-44d3-81c5-9371aacbb1c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2121.323497] env[60788]: DEBUG oslo_concurrency.lockutils [req-731c7afd-ba2f-4397-b093-50da59e6ac21 req-92bfbfc7-ad65-4b3b-97d2-f41cf42268dc service nova] Lock "432ef65e-4072-44d3-81c5-9371aacbb1c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2121.323658] env[60788]: DEBUG nova.compute.manager [req-731c7afd-ba2f-4397-b093-50da59e6ac21 req-92bfbfc7-ad65-4b3b-97d2-f41cf42268dc service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] No waiting events found dispatching network-vif-plugged-bd6c52ae-daea-432c-838a-650b6116eb41 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2121.323830] env[60788]: WARNING nova.compute.manager [req-731c7afd-ba2f-4397-b093-50da59e6ac21 req-92bfbfc7-ad65-4b3b-97d2-f41cf42268dc service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Received unexpected event network-vif-plugged-bd6c52ae-daea-432c-838a-650b6116eb41 for instance with vm_state building and task_state spawning. 
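
The four lockutils/compute.manager entries just above show the external-event handshake between nova-compute and Neutron as the log itself names it (nova.compute.manager.InstanceEvents.pop_instance_event): the manager briefly takes the per-instance "-events" lock, looks for a registered waiter for network-vif-plugged-bd6c52ae-daea-432c-838a-650b6116eb41, finds none because the instance is still in vm_state building with nothing blocked on the event yet, and logs the WARNING about an unexpected event. The following is only a minimal sketch of that observable pattern, not Nova's actual implementation; the class name InstanceEventSketch and its methods are hypothetical.

import threading

class InstanceEventSketch:
    def __init__(self):
        # plays the role of the per-instance "<uuid>-events" lock in the log
        self._lock = threading.Lock()
        # (instance_uuid, event_name) -> threading.Event for a blocked waiter
        self._waiters = {}

    def prepare_for_event(self, instance_uuid, event_name):
        # called before an operation that expects the event, so dispatch
        # has something to wake up
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def dispatch(self, instance_uuid, event_name):
        # called when the external event arrives from the network service
        with self._lock:
            waiter = self._waiters.pop((instance_uuid, event_name), None)
        if waiter is None:
            # mirrors the WARNING "Received unexpected event ..." above
            print(f"unexpected event {event_name} for {instance_uuid}")
        else:
            waiter.set()  # wakes whatever thread is blocked in waiter.wait()

if __name__ == "__main__":
    events = InstanceEventSketch()
    # no waiter was registered, so this takes the "unexpected event" path,
    # just as in the log entry above
    events.dispatch("432ef65e-4072-44d3-81c5-9371aacbb1c2",
                    "network-vif-plugged-bd6c52ae-daea-432c-838a-650b6116eb41")

In the real service the waiter side is used, for example, while plugging VIFs during spawn; here the event raced ahead of any waiter, which is why it is only a WARNING and the build continues below.
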
[ 2121.423315] env[60788]: DEBUG nova.network.neutron [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Successfully updated port: bd6c52ae-daea-432c-838a-650b6116eb41 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2121.434710] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "refresh_cache-432ef65e-4072-44d3-81c5-9371aacbb1c2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2121.434859] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquired lock "refresh_cache-432ef65e-4072-44d3-81c5-9371aacbb1c2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2121.435015] env[60788]: DEBUG nova.network.neutron [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2121.502611] env[60788]: DEBUG nova.network.neutron [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2121.926473] env[60788]: DEBUG nova.network.neutron [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Updating instance_info_cache with network_info: [{"id": "bd6c52ae-daea-432c-838a-650b6116eb41", "address": "fa:16:3e:a1:3f:54", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd6c52ae-da", "ovs_interfaceid": "bd6c52ae-daea-432c-838a-650b6116eb41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2121.941239] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Releasing lock "refresh_cache-432ef65e-4072-44d3-81c5-9371aacbb1c2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2121.941586] env[60788]: DEBUG nova.compute.manager [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Instance network_info: |[{"id": "bd6c52ae-daea-432c-838a-650b6116eb41", "address": "fa:16:3e:a1:3f:54", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd6c52ae-da", "ovs_interfaceid": "bd6c52ae-daea-432c-838a-650b6116eb41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2121.942307] env[60788]: DEBUG 
nova.virt.vmwareapi.vmops [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a1:3f:54', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bd6c52ae-daea-432c-838a-650b6116eb41', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2121.949803] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Creating folder: Project (36971c7f0bbf4e0ea014e36acbbcbdd0). Parent ref: group-v449747. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2121.950673] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e1039fa-2c01-4d4d-b585-e2b873720571 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.961719] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Created folder: Project (36971c7f0bbf4e0ea014e36acbbcbdd0) in parent group-v449747. [ 2121.961901] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Creating folder: Instances. Parent ref: group-v449850. {{(pid=60788) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2121.962149] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-26affc69-8cfb-417e-b6a3-daa9943b86c1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.971061] env[60788]: INFO nova.virt.vmwareapi.vm_util [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Created folder: Instances in parent group-v449850. [ 2121.971283] env[60788]: DEBUG oslo.service.loopingcall [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2121.971470] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2121.971663] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-681bd3ff-97a4-4221-84f2-b7dbc94e9073 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.990488] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2121.990488] env[60788]: value = "task-2205307" [ 2121.990488] env[60788]: _type = "Task" [ 2121.990488] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2121.997819] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205307, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2122.047411] env[60788]: DEBUG nova.network.neutron [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Successfully updated port: cc97a428-56a0-4567-afb8-c4a03651dfe5 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2122.056270] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "refresh_cache-66228017-46bd-4709-8771-6a1947f8a643" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2122.056488] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquired lock "refresh_cache-66228017-46bd-4709-8771-6a1947f8a643" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2122.056739] env[60788]: DEBUG nova.network.neutron [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2122.132235] env[60788]: DEBUG nova.network.neutron [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Instance cache missing network info. {{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2122.500330] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205307, 'name': CreateVM_Task, 'duration_secs': 0.280834} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2122.500647] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2122.501176] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2122.501352] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2122.501710] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2122.501959] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a6fae4a8-fa9e-4230-affb-c16663a4897c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2122.506034] env[60788]: DEBUG oslo_vmware.api [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Waiting for the task: (returnval){ [ 2122.506034] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5283d760-e802-eab0-89dd-8f74ea1fe4ab" [ 2122.506034] env[60788]: _type = "Task" [ 2122.506034] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2122.513252] env[60788]: DEBUG oslo_vmware.api [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5283d760-e802-eab0-89dd-8f74ea1fe4ab, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2122.520517] env[60788]: DEBUG nova.network.neutron [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Updating instance_info_cache with network_info: [{"id": "cc97a428-56a0-4567-afb8-c4a03651dfe5", "address": "fa:16:3e:2c:b7:9b", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc97a428-56", "ovs_interfaceid": "cc97a428-56a0-4567-afb8-c4a03651dfe5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2122.531911] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Releasing lock "refresh_cache-66228017-46bd-4709-8771-6a1947f8a643" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2122.532222] env[60788]: DEBUG nova.compute.manager [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Instance network_info: |[{"id": "cc97a428-56a0-4567-afb8-c4a03651dfe5", "address": "fa:16:3e:2c:b7:9b", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc97a428-56", "ovs_interfaceid": "cc97a428-56a0-4567-afb8-c4a03651dfe5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} [ 2122.532591] env[60788]: 
DEBUG nova.virt.vmwareapi.vmops [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:b7:9b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1bf71001-973b-4fda-b804-ee6abcd12776', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cc97a428-56a0-4567-afb8-c4a03651dfe5', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2122.540101] env[60788]: DEBUG oslo.service.loopingcall [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2122.540522] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2122.540880] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-06815330-d66a-477d-9892-81bb0f00b842 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2122.560749] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2122.560749] env[60788]: value = "task-2205308" [ 2122.560749] env[60788]: _type = "Task" [ 2122.560749] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2122.568545] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205308, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2123.016548] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2123.016808] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2123.017037] env[60788]: DEBUG oslo_concurrency.lockutils [None req-803a58db-7391-4b8f-b3c3-d1a838187e10 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2123.070489] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205308, 'name': CreateVM_Task, 'duration_secs': 0.26883} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2123.070660] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2123.071355] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2123.071532] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2123.071826] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2123.072082] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1de5fa0c-0b7f-4c9b-b55a-490af6040e7c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2123.076331] env[60788]: DEBUG oslo_vmware.api [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Waiting for the task: (returnval){ [ 2123.076331] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]521f9ce7-0784-7680-4b17-89b1550cd705" [ 2123.076331] env[60788]: _type = "Task" [ 2123.076331] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2123.084925] env[60788]: DEBUG oslo_vmware.api [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]521f9ce7-0784-7680-4b17-89b1550cd705, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2123.352568] env[60788]: DEBUG nova.compute.manager [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Received event network-changed-bd6c52ae-daea-432c-838a-650b6116eb41 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2123.352767] env[60788]: DEBUG nova.compute.manager [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Refreshing instance network info cache due to event network-changed-bd6c52ae-daea-432c-838a-650b6116eb41. 
{{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2123.352965] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Acquiring lock "refresh_cache-432ef65e-4072-44d3-81c5-9371aacbb1c2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2123.353211] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Acquired lock "refresh_cache-432ef65e-4072-44d3-81c5-9371aacbb1c2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2123.353389] env[60788]: DEBUG nova.network.neutron [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Refreshing network info cache for port bd6c52ae-daea-432c-838a-650b6116eb41 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2123.588075] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2123.588381] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2123.588633] env[60788]: DEBUG oslo_concurrency.lockutils [None req-1a946f76-238e-427a-adfb-64f88273c6e5 tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2123.672529] env[60788]: DEBUG nova.network.neutron [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Updated VIF entry in instance network info cache for port bd6c52ae-daea-432c-838a-650b6116eb41. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2123.672877] env[60788]: DEBUG nova.network.neutron [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Updating instance_info_cache with network_info: [{"id": "bd6c52ae-daea-432c-838a-650b6116eb41", "address": "fa:16:3e:a1:3f:54", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.143", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbd6c52ae-da", "ovs_interfaceid": "bd6c52ae-daea-432c-838a-650b6116eb41", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2123.681590] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Releasing lock "refresh_cache-432ef65e-4072-44d3-81c5-9371aacbb1c2" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2123.681822] env[60788]: DEBUG nova.compute.manager [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Received event network-vif-plugged-cc97a428-56a0-4567-afb8-c4a03651dfe5 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2123.682016] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Acquiring lock "66228017-46bd-4709-8771-6a1947f8a643-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2123.682225] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Lock "66228017-46bd-4709-8771-6a1947f8a643-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2123.682387] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Lock "66228017-46bd-4709-8771-6a1947f8a643-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2123.682572] env[60788]: DEBUG nova.compute.manager 
[req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] No waiting events found dispatching network-vif-plugged-cc97a428-56a0-4567-afb8-c4a03651dfe5 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2123.682791] env[60788]: WARNING nova.compute.manager [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Received unexpected event network-vif-plugged-cc97a428-56a0-4567-afb8-c4a03651dfe5 for instance with vm_state building and task_state spawning. [ 2123.682967] env[60788]: DEBUG nova.compute.manager [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Received event network-changed-cc97a428-56a0-4567-afb8-c4a03651dfe5 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2123.683164] env[60788]: DEBUG nova.compute.manager [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Refreshing instance network info cache due to event network-changed-cc97a428-56a0-4567-afb8-c4a03651dfe5. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2123.683358] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Acquiring lock "refresh_cache-66228017-46bd-4709-8771-6a1947f8a643" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2123.683498] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Acquired lock "refresh_cache-66228017-46bd-4709-8771-6a1947f8a643" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2123.683653] env[60788]: DEBUG nova.network.neutron [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Refreshing network info cache for port cc97a428-56a0-4567-afb8-c4a03651dfe5 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2123.908698] env[60788]: DEBUG nova.network.neutron [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Updated VIF entry in instance network info cache for port cc97a428-56a0-4567-afb8-c4a03651dfe5. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2123.909078] env[60788]: DEBUG nova.network.neutron [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Updating instance_info_cache with network_info: [{"id": "cc97a428-56a0-4567-afb8-c4a03651dfe5", "address": "fa:16:3e:2c:b7:9b", "network": {"id": "607a5b5f-e457-42fc-bfb4-959832d584d2", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "557dda57b95d4800898c4a941c455b73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1bf71001-973b-4fda-b804-ee6abcd12776", "external-id": "nsx-vlan-transportzone-498", "segmentation_id": 498, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcc97a428-56", "ovs_interfaceid": "cc97a428-56a0-4567-afb8-c4a03651dfe5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2123.918872] env[60788]: DEBUG oslo_concurrency.lockutils [req-b04faa59-0b2a-4c09-897f-87e5e48a474c req-81548bb5-c55b-43da-97ae-9018e9d1eba6 service nova] Releasing lock "refresh_cache-66228017-46bd-4709-8771-6a1947f8a643" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2152.214022] env[60788]: WARNING oslo_vmware.rw_handles [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2152.214022] env[60788]: ERROR oslo_vmware.rw_handles [ 2152.214022] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f 
tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2152.215809] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2152.216087] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Copying Virtual Disk [datastore2] vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/9be4177e-4e05-4ebf-9114-2c289349a30c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2152.216384] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-08e66dcf-b456-43f6-a482-cb058e4018bb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.223842] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 2152.223842] env[60788]: value = "task-2205309" [ 2152.223842] env[60788]: _type = "Task" [ 2152.223842] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2152.231434] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205309, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2152.734832] env[60788]: DEBUG oslo_vmware.exceptions [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2152.735231] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2152.735816] env[60788]: ERROR nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2152.735816] env[60788]: Faults: ['InvalidArgument'] [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Traceback (most recent call last): [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] yield resources [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self.driver.spawn(context, instance, image_meta, [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self._fetch_image_if_missing(context, vi) [ 2152.735816] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] image_cache(vi, tmp_image_ds_loc) [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] vm_util.copy_virtual_disk( [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] session._wait_for_task(vmdk_copy_task) [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] return self.wait_for_task(task_ref) [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] return evt.wait() [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] result = hub.switch() [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2152.736191] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] return self.greenlet.switch() [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self.f(*self.args, **self.kw) [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] raise exceptions.translate_fault(task_info.error) [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Faults: ['InvalidArgument'] [ 2152.736571] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] [ 2152.736571] env[60788]: INFO nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Terminating instance [ 2152.738423] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2152.738423] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2152.738423] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-2b9a3089-6329-44e0-969c-948cbcacfc54 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.740346] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2152.740541] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2152.741272] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8620f79-7fd9-4b50-893d-e7fb958ab000 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.747981] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2152.748211] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e34e7a65-95a3-417c-b722-e68b98587d1b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.750373] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2152.750549] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2152.751522] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f7b743a-847a-4ae7-99e2-9966936c7880 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.756161] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 2152.756161] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]525286f1-e2d7-7180-396e-2eeffa683bd7" [ 2152.756161] env[60788]: _type = "Task" [ 2152.756161] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2152.769092] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2152.769322] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating directory with path [datastore2] vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2152.769530] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb051c34-1692-41df-ab1a-5320ef0c0de2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.788324] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Created directory with path [datastore2] vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2152.788515] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Fetch image to [datastore2] vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2152.788683] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2152.789491] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d13c3874-de82-4f54-a7dc-58e697b11fc9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.796260] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24ff3298-4629-49a2-b8df-fff2f40d49ee {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.806490] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f855977-ddf6-4b3a-a30c-124f6e0cf260 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.812328] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f 
tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2152.812556] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2152.812752] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleting the datastore file [datastore2] 7cc29f7d-e708-44a9-8ab6-5204163e9c96 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2152.813014] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ec45818-d6ad-41c3-8343-9ef236a21863 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.839991] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bebbb89-a7a4-428f-8728-713fb62a865f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.843332] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for the task: (returnval){ [ 2152.843332] env[60788]: value = "task-2205311" [ 2152.843332] env[60788]: _type = "Task" [ 2152.843332] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2152.848308] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9d354288-0822-4a27-8e58-fc9e47b5055b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.852529] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205311, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2152.878090] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2152.928525] env[60788]: DEBUG oslo_vmware.rw_handles [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2152.988954] env[60788]: DEBUG oslo_vmware.rw_handles [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2152.989172] env[60788]: DEBUG oslo_vmware.rw_handles [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2153.355118] env[60788]: DEBUG oslo_vmware.api [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Task: {'id': task-2205311, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06516} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2153.355480] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2153.355623] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2153.355832] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2153.356053] env[60788]: INFO nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2153.358151] env[60788]: DEBUG nova.compute.claims [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2153.358376] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2153.358704] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2153.508440] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddfd490f-d281-4ff7-a057-076ba2193c45 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.515819] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f096960-265c-4aa9-9503-f4c9974d2439 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.546405] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7586c96-d98f-4d1e-a22e-911d284e9f81 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.553016] env[60788]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc34bc20-0d9f-467d-ab8c-9025113d1e61 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.565797] env[60788]: DEBUG nova.compute.provider_tree [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2153.574029] env[60788]: DEBUG nova.scheduler.client.report [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2153.587919] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.229s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.588435] env[60788]: ERROR nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2153.588435] env[60788]: Faults: ['InvalidArgument'] [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Traceback (most recent call last): [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self.driver.spawn(context, instance, image_meta, [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self._fetch_image_if_missing(context, vi) [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] image_cache(vi, tmp_image_ds_loc) [ 2153.588435] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] vm_util.copy_virtual_disk( [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] session._wait_for_task(vmdk_copy_task) [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] return self.wait_for_task(task_ref) [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] return evt.wait() [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] result = hub.switch() [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] return self.greenlet.switch() [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2153.588887] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] self.f(*self.args, **self.kw) [ 2153.589369] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2153.589369] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] raise exceptions.translate_fault(task_info.error) [ 2153.589369] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2153.589369] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Faults: ['InvalidArgument'] [ 2153.589369] env[60788]: ERROR nova.compute.manager [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] [ 2153.589369] env[60788]: DEBUG nova.compute.utils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] VimFaultException {{(pid=60788) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2153.590411] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Build of instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 was re-scheduled: A specified parameter was not correct: fileType [ 2153.590411] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2153.590784] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2153.590953] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2153.591141] env[60788]: DEBUG nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2153.591303] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2153.893950] env[60788]: DEBUG nova.network.neutron [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2153.921969] env[60788]: INFO nova.compute.manager [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Took 0.33 seconds to deallocate network for instance. 
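The traceback above bottoms out in oslo.vmware: Nova's copy_virtual_disk submits a CopyVirtualDisk_Task to vCenter and then blocks in session.wait_for_task, whose polling loop re-raises the server-side fault through exceptions.translate_fault (api.py:448 in the trace). That is how the 'InvalidArgument' fault on the 'fileType' parameter surfaces as a VimFaultException in the compute manager. A minimal sketch of that call pattern, assuming oslo.vmware's public invoke_api/wait_for_task interface; the helper name and its argument list are illustrative, not Nova's actual code:

from oslo_vmware import exceptions

def copy_sparse_image(session, dc_ref, source_vmdk_path, dest_vmdk_path):
    # Mirrors the shape of nova.virt.vmwareapi.vm_util.copy_virtual_disk:
    # submit the vCenter disk-copy task, then block until it completes.
    vim = session.vim
    task = session.invoke_api(
        vim, 'CopyVirtualDisk_Task',
        vim.service_content.virtualDiskManager,
        sourceName=source_vmdk_path, sourceDatacenter=dc_ref,
        destName=dest_vmdk_path, destDatacenter=dc_ref)
    try:
        session.wait_for_task(task)
    except exceptions.VimFaultException as exc:
        # In this log, exc.fault_list == ['InvalidArgument'] and the message
        # names the rejected parameter ('fileType'); Nova aborts the resource
        # claim and reschedules the build, as the records below show.
        raise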
[ 2154.009108] env[60788]: INFO nova.scheduler.client.report [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Deleted allocations for instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 [ 2154.028994] env[60788]: DEBUG oslo_concurrency.lockutils [None req-4100ddea-a811-4c5e-838b-ca1de6c8960f tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.179s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.029289] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 471.277s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.029483] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] During sync_power_state the instance has a pending task (spawning). Skip. [ 2154.029658] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.030145] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 429.599s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.030372] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Acquiring lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2154.030572] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.030735] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.032685] env[60788]: INFO nova.compute.manager [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Terminating instance [ 2154.034512] env[60788]: DEBUG nova.compute.manager [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2154.034633] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2154.035075] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5a0118c8-9b24-4540-97ee-c46245ec718c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2154.044578] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46d7182a-6435-447d-a4f9-ee5df96ac919 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2154.072903] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7cc29f7d-e708-44a9-8ab6-5204163e9c96 could not be found. [ 2154.073143] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2154.073336] env[60788]: INFO nova.compute.manager [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2154.073575] env[60788]: DEBUG oslo.service.loopingcall [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2154.073789] env[60788]: DEBUG nova.compute.manager [-] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2154.073888] env[60788]: DEBUG nova.network.neutron [-] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2154.117598] env[60788]: DEBUG nova.network.neutron [-] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2154.126823] env[60788]: INFO nova.compute.manager [-] [instance: 7cc29f7d-e708-44a9-8ab6-5204163e9c96] Took 0.05 seconds to deallocate network for instance. [ 2154.225357] env[60788]: DEBUG oslo_concurrency.lockutils [None req-6e22cbb4-f682-401b-a961-c4bb8fe1904c tempest-ServerDiskConfigTestJSON-1540893514 tempest-ServerDiskConfigTestJSON-1540893514-project-member] Lock "7cc29f7d-e708-44a9-8ab6-5204163e9c96" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2166.510723] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "2324cad4-a7e4-429a-8503-24e6b6f90033" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2166.511040] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "2324cad4-a7e4-429a-8503-24e6b6f90033" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2166.523517] env[60788]: DEBUG nova.compute.manager [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Starting instance... 
{{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2407}} [ 2166.567855] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2166.568145] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2166.569616] env[60788]: INFO nova.compute.claims [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2166.721988] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf4186a1-5548-4a46-80cc-bdf933018535 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.729629] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef8775c8-1c4a-4799-83ad-fb4cc5f3cbcd {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.759788] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-affae846-7045-4ca7-85e8-b8a83897f1da {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.766340] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbf3ead2-0eea-4c5b-a5b1-47265ca0f944 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.779303] env[60788]: DEBUG nova.compute.provider_tree [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2166.787617] env[60788]: DEBUG nova.scheduler.client.report [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2166.800507] env[60788]: DEBUG oslo_concurrency.lockutils [None 
req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2166.800969] env[60788]: DEBUG nova.compute.manager [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Start building networks asynchronously for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2804}} [ 2166.831477] env[60788]: DEBUG nova.compute.utils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Using /dev/sd instead of None {{(pid=60788) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2166.832636] env[60788]: DEBUG nova.compute.manager [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Allocating IP information in the background. {{(pid=60788) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} [ 2166.832814] env[60788]: DEBUG nova.network.neutron [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] allocate_for_instance() {{(pid=60788) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2166.843922] env[60788]: DEBUG nova.compute.manager [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Start building block device mappings for instance. {{(pid=60788) _build_resources /opt/stack/nova/nova/compute/manager.py:2839}} [ 2166.897572] env[60788]: DEBUG nova.policy [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '571aaecebbc249e3ae4d9306e1e109ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e80c355190594f5a960ca2d14c3f010c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60788) authorize /opt/stack/nova/nova/policy.py:203}} [ 2166.914919] env[60788]: DEBUG nova.compute.manager [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Start spawning the instance on the hypervisor. 
{{(pid=60788) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2613}} [ 2166.962317] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-03T12:04:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-03T12:04:36Z,direct_url=,disk_format='vmdk',id=1d9d6f6c-1335-48c8-9690-b6c8e781cb21,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='557dda57b95d4800898c4a941c455b73',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-03T12:04:37Z,virtual_size=,visibility=), allow threads: False {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2166.962558] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Flavor limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2166.962711] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Image limits 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2166.962888] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Flavor pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2166.963110] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Image pref 0:0:0 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2166.963277] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60788) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2166.963484] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2166.963644] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2166.963818] env[60788]: DEBUG nova.virt.hardware [None 
req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Got 1 possible topologies {{(pid=60788) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2166.963985] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2166.964211] env[60788]: DEBUG nova.virt.hardware [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60788) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2166.965070] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff1279fb-3218-4046-a7df-fe92e3a9155c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.973117] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a97be04d-aa1f-4460-a58a-6938f8e6bfa2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.342236] env[60788]: DEBUG nova.network.neutron [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Successfully created port: a73c60fb-585d-4027-a461-16b5a43c29b0 {{(pid=60788) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2167.749533] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2168.173197] env[60788]: DEBUG nova.compute.manager [req-c0ef5537-c394-43f2-a251-4a678624c200 req-44d2d70c-56e8-4e12-947c-a7574f1f0649 service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Received event network-vif-plugged-a73c60fb-585d-4027-a461-16b5a43c29b0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2168.173476] env[60788]: DEBUG oslo_concurrency.lockutils [req-c0ef5537-c394-43f2-a251-4a678624c200 req-44d2d70c-56e8-4e12-947c-a7574f1f0649 service nova] Acquiring lock "2324cad4-a7e4-429a-8503-24e6b6f90033-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2168.173643] env[60788]: DEBUG oslo_concurrency.lockutils [req-c0ef5537-c394-43f2-a251-4a678624c200 req-44d2d70c-56e8-4e12-947c-a7574f1f0649 service nova] Lock "2324cad4-a7e4-429a-8503-24e6b6f90033-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2168.173828] env[60788]: DEBUG oslo_concurrency.lockutils [req-c0ef5537-c394-43f2-a251-4a678624c200 req-44d2d70c-56e8-4e12-947c-a7574f1f0649 service nova] Lock 
"2324cad4-a7e4-429a-8503-24e6b6f90033-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2168.173977] env[60788]: DEBUG nova.compute.manager [req-c0ef5537-c394-43f2-a251-4a678624c200 req-44d2d70c-56e8-4e12-947c-a7574f1f0649 service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] No waiting events found dispatching network-vif-plugged-a73c60fb-585d-4027-a461-16b5a43c29b0 {{(pid=60788) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2168.175512] env[60788]: WARNING nova.compute.manager [req-c0ef5537-c394-43f2-a251-4a678624c200 req-44d2d70c-56e8-4e12-947c-a7574f1f0649 service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Received unexpected event network-vif-plugged-a73c60fb-585d-4027-a461-16b5a43c29b0 for instance with vm_state building and task_state spawning. [ 2168.284612] env[60788]: DEBUG nova.network.neutron [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Successfully updated port: a73c60fb-585d-4027-a461-16b5a43c29b0 {{(pid=60788) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2168.301423] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "refresh_cache-2324cad4-a7e4-429a-8503-24e6b6f90033" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2168.301575] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "refresh_cache-2324cad4-a7e4-429a-8503-24e6b6f90033" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2168.301725] env[60788]: DEBUG nova.network.neutron [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Building network info cache for instance {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2168.357729] env[60788]: DEBUG nova.network.neutron [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Instance cache missing network info. 
{{(pid=60788) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2168.532403] env[60788]: DEBUG nova.network.neutron [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Updating instance_info_cache with network_info: [{"id": "a73c60fb-585d-4027-a461-16b5a43c29b0", "address": "fa:16:3e:24:73:ef", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa73c60fb-58", "ovs_interfaceid": "a73c60fb-585d-4027-a461-16b5a43c29b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2168.544511] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "refresh_cache-2324cad4-a7e4-429a-8503-24e6b6f90033" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2168.544922] env[60788]: DEBUG nova.compute.manager [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Instance network_info: |[{"id": "a73c60fb-585d-4027-a461-16b5a43c29b0", "address": "fa:16:3e:24:73:ef", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa73c60fb-58", "ovs_interfaceid": "a73c60fb-585d-4027-a461-16b5a43c29b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60788) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} [ 2168.545191] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:73:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3f4a795c-8718-4a7c-aafe-9da231df10f8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a73c60fb-585d-4027-a461-16b5a43c29b0', 'vif_model': 'vmxnet3'}] {{(pid=60788) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2168.552779] env[60788]: DEBUG oslo.service.loopingcall [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2168.553264] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Creating VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2168.553600] env[60788]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3bb799c7-6a71-49c8-9219-7eff65190908 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.575464] env[60788]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2168.575464] env[60788]: value = "task-2205312" [ 2168.575464] env[60788]: _type = "Task" [ 2168.575464] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2168.586137] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205312, 'name': CreateVM_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2169.085206] env[60788]: DEBUG oslo_vmware.api [-] Task: {'id': task-2205312, 'name': CreateVM_Task, 'duration_secs': 0.284323} completed successfully. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2169.085552] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Created VM on the ESX host {{(pid=60788) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2169.085993] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2169.086179] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2169.086490] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2169.086731] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-22850e63-2cc1-41a0-accf-943026bb9875 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.090781] env[60788]: DEBUG oslo_vmware.api [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){ [ 2169.090781] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52701df2-a76e-082c-c750-03923aeef6af" [ 2169.090781] env[60788]: _type = "Task" [ 2169.090781] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2169.097922] env[60788]: DEBUG oslo_vmware.api [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]52701df2-a76e-082c-c750-03923aeef6af, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2169.601990] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2169.602248] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Processing image 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2169.602470] env[60788]: DEBUG oslo_concurrency.lockutils [None req-54d707a4-2cac-4995-82c8-be73ec1364c1 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2170.200769] env[60788]: DEBUG nova.compute.manager [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Received event network-changed-a73c60fb-585d-4027-a461-16b5a43c29b0 {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11102}} [ 2170.201048] env[60788]: DEBUG nova.compute.manager [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Refreshing instance network info cache due to event network-changed-a73c60fb-585d-4027-a461-16b5a43c29b0. {{(pid=60788) external_instance_event /opt/stack/nova/nova/compute/manager.py:11107}} [ 2170.201231] env[60788]: DEBUG oslo_concurrency.lockutils [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] Acquiring lock "refresh_cache-2324cad4-a7e4-429a-8503-24e6b6f90033" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2170.201383] env[60788]: DEBUG oslo_concurrency.lockutils [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] Acquired lock "refresh_cache-2324cad4-a7e4-429a-8503-24e6b6f90033" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2170.201546] env[60788]: DEBUG nova.network.neutron [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Refreshing network info cache for port a73c60fb-585d-4027-a461-16b5a43c29b0 {{(pid=60788) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2170.433013] env[60788]: DEBUG nova.network.neutron [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Updated VIF entry in instance network info cache for port a73c60fb-585d-4027-a461-16b5a43c29b0. 
{{(pid=60788) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2170.433410] env[60788]: DEBUG nova.network.neutron [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Updating instance_info_cache with network_info: [{"id": "a73c60fb-585d-4027-a461-16b5a43c29b0", "address": "fa:16:3e:24:73:ef", "network": {"id": "581246d4-8e9b-43d9-b1a9-1bce99840a2b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1525450422-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e80c355190594f5a960ca2d14c3f010c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3f4a795c-8718-4a7c-aafe-9da231df10f8", "external-id": "nsx-vlan-transportzone-162", "segmentation_id": 162, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa73c60fb-58", "ovs_interfaceid": "a73c60fb-585d-4027-a461-16b5a43c29b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2170.442546] env[60788]: DEBUG oslo_concurrency.lockutils [req-e3223e9e-4c8b-4e6c-aa7e-f42161af7ebe req-a225c3f2-7ec7-4eb7-af23-b9a4a74c856d service nova] Releasing lock "refresh_cache-2324cad4-a7e4-429a-8503-24e6b6f90033" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2171.753704] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.753642] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2172.753873] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2172.754169] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2172.775365] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.775534] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.775651] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.775777] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.775900] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.776035] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.776165] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.776285] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.776403] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2172.776521] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2174.754167] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2174.754554] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2174.754638] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2174.754765] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2174.754917] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2174.767252] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2174.767427] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2174.767596] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2174.767754] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2174.768920] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea940a95-f223-4c23-b18f-f0183132b268 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2174.777822] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aafe7b4e-1dee-41c4-9e18-7d569f61a034 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2174.791904] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d7a0af6b-429a-4fdf-bf22-44404f9963e3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2174.798128] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f978eee-d7dc-4b92-8f89-5fe9a3100bf8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2174.828611] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181249MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2174.828779] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2174.828975] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2174.896947] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 4864273c-b505-4e31-bf7b-633ba1e99562 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897127] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897262] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897386] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897505] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897622] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5aad5755-1a12-45a1-b30c-9a407992ad62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897736] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 432ef65e-4072-44d3-81c5-9371aacbb1c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897850] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 66228017-46bd-4709-8771-6a1947f8a643 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.897964] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2324cad4-a7e4-429a-8503-24e6b6f90033 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2174.898199] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2174.898309] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2175.009675] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eb62ff3-adae-47bf-819e-07945d7ed3d7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2175.017301] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13573a4f-d815-4487-8306-bbd14ecdacce {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2175.046132] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28d7fc42-30cc-4ea9-a718-af6ab81d753c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2175.053243] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69edeb1-4b5f-4a43-8049-74e959654eeb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2175.067475] env[60788]: DEBUG nova.compute.provider_tree 
[None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2175.075456] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2175.089407] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2175.089581] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2176.088754] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2179.748991] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2181.753936] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2202.230191] env[60788]: WARNING oslo_vmware.rw_handles [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2202.230191] 
env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2202.230191] env[60788]: ERROR oslo_vmware.rw_handles [ 2202.230861] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2202.232616] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2202.232853] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Copying Virtual Disk [datastore2] vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/55177c8d-5f5f-4ed4-a0a4-5ca7e9189a8e/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2202.233198] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8eee3fcb-3734-41cf-8398-b659d55d9c29 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.240585] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 2202.240585] env[60788]: value = "task-2205313" [ 2202.240585] env[60788]: _type = "Task" [ 2202.240585] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.248571] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': task-2205313, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2202.750778] env[60788]: DEBUG oslo_vmware.exceptions [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2202.751067] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2202.751619] env[60788]: ERROR nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2202.751619] env[60788]: Faults: ['InvalidArgument'] [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Traceback (most recent call last): [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] yield resources [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self.driver.spawn(context, instance, image_meta, [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self._fetch_image_if_missing(context, vi) [ 2202.751619] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] image_cache(vi, tmp_image_ds_loc) [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] vm_util.copy_virtual_disk( [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] session._wait_for_task(vmdk_copy_task) [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] return self.wait_for_task(task_ref) [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] return evt.wait() [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] result = hub.switch() [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2202.751967] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] return self.greenlet.switch() [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self.f(*self.args, **self.kw) [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] raise exceptions.translate_fault(task_info.error) [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Faults: ['InvalidArgument'] [ 2202.752362] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] [ 2202.752362] env[60788]: INFO nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Terminating instance [ 2202.753496] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2202.753699] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2202.753941] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-df4ea55b-6d70-442e-a0a1-c65f293a1368 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.757191] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2202.757384] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2202.758111] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a97fdf3f-902f-4b98-aace-9de65a710225 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.764803] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2202.765018] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-be90f667-189a-49ff-b131-761e919172af {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.767061] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2202.767239] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2202.768149] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dce07d25-1d67-419c-963d-0d45a497e37d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.772843] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 2202.772843] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]52eafbfe-d5f6-cf8d-f636-8e7442f1cafe" [ 2202.772843] env[60788]: _type = "Task" [ 2202.772843] env[60788]: } to complete. 
{{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.786780] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2202.787031] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating directory with path [datastore2] vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2202.787221] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f06a7ea9-2bc4-476d-bb4b-cd48374abd25 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.806198] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Created directory with path [datastore2] vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2202.806385] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Fetch image to [datastore2] vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2202.806553] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2202.807367] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32c10011-70ae-4045-95ef-7eac76163d99 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.813543] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9bf4c34-402e-4d4c-9ae0-f3850a94778f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.822228] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a30f7c-4996-47bc-893e-3e9a95cc5f7d {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.854074] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-978670c3-671f-412a-838a-1b2a2c8f9835 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.856407] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2202.856595] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2202.856765] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Deleting the datastore file [datastore2] 4864273c-b505-4e31-bf7b-633ba1e99562 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2202.856983] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4cec6185-a2b8-42fa-a417-104e101453f6 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.862253] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4d2d04e2-0c52-4644-a939-c44dd488f8a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.863857] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for the task: (returnval){ [ 2202.863857] env[60788]: value = "task-2205315" [ 2202.863857] env[60788]: _type = "Task" [ 2202.863857] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.870939] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': task-2205315, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2202.884147] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2202.936469] env[60788]: DEBUG oslo_vmware.rw_handles [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2202.995946] env[60788]: DEBUG oslo_vmware.rw_handles [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2202.996172] env[60788]: DEBUG oslo_vmware.rw_handles [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2203.374096] env[60788]: DEBUG oslo_vmware.api [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Task: {'id': task-2205315, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068453} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2203.374353] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2203.374536] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2203.374707] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2203.374880] env[60788]: INFO nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Took 0.62 seconds to destroy the instance on the hypervisor. 
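The CreateVM_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task waits recorded above all follow the same oslo.vmware pattern: invoke_api() returns a Task managed-object reference, wait_for_task() polls it (producing the "Task: {...} progress is 0%" DEBUG lines), and a task failure is translated into a VimFaultException such as the InvalidArgument fault seen earlier. The following is a minimal sketch of that pattern using only the public oslo.vmware API; the vCenter endpoint, credentials and datastore paths are placeholders, not values taken from this run.

    # Sketch of the task-wait pattern visible throughout this log.
    from oslo_vmware import api, exceptions

    # Constructing the session logs in immediately (the SessionManager.Login
    # invocation near the top of this log).
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',          # placeholder endpoint/creds
        api_retry_count=10, task_poll_interval=0.5)

    def copy_disk(source, dest):
        # invoke_api() returns a Task moref; wait_for_task() polls it and
        # raises a translated fault on error -- e.g. the VimFaultException
        # with Faults: ['InvalidArgument'] recorded above. (Against vCenter,
        # sourceDatacenter/destDatacenter refs would normally also be passed;
        # omitted here for brevity.)
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task',
            session.vim.service_content.virtualDiskManager,
            sourceName=source, destName=dest)
        try:
            return session.wait_for_task(task)
        except exceptions.VimFaultException as e:
            print('copy failed, faults: %s' % e.fault_list)
            raise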
[ 2203.377012] env[60788]: DEBUG nova.compute.claims [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2203.377197] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2203.377437] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2203.526056] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-317b5d64-7340-4a60-921c-14239fa48e4e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.533269] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2c91dc9-cc0e-4379-b337-13531d35bc38 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.562564] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e87d611-5c8b-4252-873d-6c66e21e3ab8 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.569297] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c9a1218-a44e-443c-b077-ea3587ad05ae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.583170] env[60788]: DEBUG nova.compute.provider_tree [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2203.591456] env[60788]: DEBUG nova.scheduler.client.report [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2203.606827] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 
tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.229s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2203.607351] env[60788]: ERROR nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2203.607351] env[60788]: Faults: ['InvalidArgument'] [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Traceback (most recent call last): [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self.driver.spawn(context, instance, image_meta, [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self._fetch_image_if_missing(context, vi) [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] image_cache(vi, tmp_image_ds_loc) [ 2203.607351] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] vm_util.copy_virtual_disk( [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] session._wait_for_task(vmdk_copy_task) [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] return self.wait_for_task(task_ref) [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] return evt.wait() [ 
2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] result = hub.switch() [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] return self.greenlet.switch() [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2203.607701] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] self.f(*self.args, **self.kw) [ 2203.608055] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2203.608055] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] raise exceptions.translate_fault(task_info.error) [ 2203.608055] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2203.608055] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Faults: ['InvalidArgument'] [ 2203.608055] env[60788]: ERROR nova.compute.manager [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] [ 2203.608055] env[60788]: DEBUG nova.compute.utils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2203.609432] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Build of instance 4864273c-b505-4e31-bf7b-633ba1e99562 was re-scheduled: A specified parameter was not correct: fileType [ 2203.609432] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2203.609799] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2203.609969] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2203.610185] env[60788]: DEBUG nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2203.610361] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2203.873463] env[60788]: DEBUG nova.network.neutron [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2203.885637] env[60788]: INFO nova.compute.manager [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Took 0.28 seconds to deallocate network for instance. [ 2203.979063] env[60788]: INFO nova.scheduler.client.report [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Deleted allocations for instance 4864273c-b505-4e31-bf7b-633ba1e99562 [ 2203.999965] env[60788]: DEBUG oslo_concurrency.lockutils [None req-316bff9b-5b66-4f02-8125-85072281eec8 tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "4864273c-b505-4e31-bf7b-633ba1e99562" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 535.855s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2204.000291] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "4864273c-b505-4e31-bf7b-633ba1e99562" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 339.844s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2204.000455] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "4864273c-b505-4e31-bf7b-633ba1e99562-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2204.000662] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "4864273c-b505-4e31-bf7b-633ba1e99562-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2204.000829] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "4864273c-b505-4e31-bf7b-633ba1e99562-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2204.002840] env[60788]: INFO nova.compute.manager [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Terminating instance [ 2204.004744] env[60788]: DEBUG nova.compute.manager [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2204.005049] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2204.005536] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b0df992-1c44-43ca-911d-3e47f044d09f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.015417] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8771f026-c065-4294-939f-2641cdd01864 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.042688] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4864273c-b505-4e31-bf7b-633ba1e99562 could not be found. [ 2204.042894] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2204.043108] env[60788]: INFO nova.compute.manager [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2204.043355] env[60788]: DEBUG oslo.service.loopingcall [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2204.043574] env[60788]: DEBUG nova.compute.manager [-] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2204.043670] env[60788]: DEBUG nova.network.neutron [-] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2204.066979] env[60788]: DEBUG nova.network.neutron [-] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2204.074475] env[60788]: INFO nova.compute.manager [-] [instance: 4864273c-b505-4e31-bf7b-633ba1e99562] Took 0.03 seconds to deallocate network for instance. [ 2204.154439] env[60788]: DEBUG oslo_concurrency.lockutils [None req-0e713bc1-d9dc-4a1b-97fd-286640aed7ee tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Lock "4864273c-b505-4e31-bf7b-633ba1e99562" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.154s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2233.754585] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2234.754414] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2234.754552] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}} [ 2234.754670] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}} [ 2234.773923] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774118] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. 
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774229] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774380] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774509] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774631] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774753] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774872] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}} [ 2234.774993] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. 
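Every instance in the heal list above is skipped because it is still Building, so the periodic task ends with nothing to refresh. A sketch of that filter, assuming a plain list of instance dicts rather than Nova's instance objects:

```python
# Illustrative version of the Building-state filter behind the repeated
# "Skipping network cache update" entries; the dict shape is an assumption.
BUILDING = 'building'

def instances_to_heal(instances):
    candidates = []
    for inst in instances:
        if inst['vm_state'] == BUILDING:
            continue   # network info is still being allocated; skip it
        candidates.append(inst)
    return candidates  # empty here -> "Didn't find any instances ..."
```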
{{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}} [ 2234.775458] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2234.775634] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2234.775796] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2234.787994] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2234.788240] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2234.788398] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2234.788548] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2234.789676] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-839895d5-442a-4459-a662-d8ae357a9994 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.799514] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a71dee3-2f7a-4b07-bf50-129c3e03c8e5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.813141] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6d28ab2-dbbd-4e61-9fe0-9011576c666b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.818999] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8b52e1-2932-49e1-b1d7-d1da8769766e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2234.847306] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181226MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2234.847445] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2234.847633] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2234.910681] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance db89c7e8-6d81-4c0a-9111-9f6256588967 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.910854] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911011] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911151] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911276] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5aad5755-1a12-45a1-b30c-9a407992ad62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911398] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 432ef65e-4072-44d3-81c5-9371aacbb1c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911520] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 66228017-46bd-4709-8771-6a1947f8a643 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911632] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2324cad4-a7e4-429a-8503-24e6b6f90033 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2234.911811] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2234.911947] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2235.005534] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2a81001-4be0-4113-b563-404a66752631 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.012969] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a492d630-feb9-4b55-a677-09f648ae6e3a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.043262] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f76ca66-a2ef-4efe-9845-7c7732cd5af5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.050100] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1800947e-4d21-4d12-8bd4-2ac1426632ba {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2235.062837] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2235.070744] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} 
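The final resource view follows directly from the eight allocations listed above (each 1 GB disk, 128 MB RAM, 1 VCPU) plus the 512 MB the inventory reserves. A quick check of the arithmetic:

```python
# Reproducing the used_ram/used_disk/used_vcpus figures from the log.
allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 8
reserved_ram_mb = 512  # from the MEMORY_MB inventory reported above

used_ram_mb = reserved_ram_mb + sum(a['MEMORY_MB'] for a in allocations)
used_disk_gb = sum(a['DISK_GB'] for a in allocations)
used_vcpus = sum(a['VCPU'] for a in allocations)

assert used_ram_mb == 1536  # matches used_ram=1536MB
assert used_disk_gb == 8    # matches used_disk=8GB
assert used_vcpus == 8      # matches used_vcpus=8
```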
{{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2235.084590] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2235.084797] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2236.062883] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2236.753795] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2236.753968] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}} [ 2239.750109] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2242.754414] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2252.749272] env[60788]: WARNING oslo_vmware.rw_handles [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles response.begin() [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2252.749272] env[60788]: ERROR 
oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2252.749272] env[60788]: ERROR oslo_vmware.rw_handles [ 2252.749272] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2252.750274] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2252.750274] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Copying Virtual Disk [datastore2] vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/0ada4f40-2dc5-4419-8494-65ed213e21ee/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2252.750394] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3baea7b4-5c2a-4eca-b2e3-7f849a46bd34 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.761398] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 2252.761398] env[60788]: value = "task-2205316" [ 2252.761398] env[60788]: _type = "Task" [ 2252.761398] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2252.769942] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205316, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2253.272056] env[60788]: DEBUG oslo_vmware.exceptions [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Fault InvalidArgument not matched. 
{{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2253.272294] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2253.272872] env[60788]: ERROR nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2253.272872] env[60788]: Faults: ['InvalidArgument'] [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Traceback (most recent call last): [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] yield resources [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self.driver.spawn(context, instance, image_meta, [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self._fetch_image_if_missing(context, vi) [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2253.272872] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] image_cache(vi, tmp_image_ds_loc) [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] vm_util.copy_virtual_disk( [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] session._wait_for_task(vmdk_copy_task) [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] return self.wait_for_task(task_ref) [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] return evt.wait() [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] result = hub.switch() [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] return self.greenlet.switch() [ 2253.273271] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2253.273578] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self.f(*self.args, **self.kw) [ 2253.273578] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2253.273578] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] raise exceptions.translate_fault(task_info.error) [ 2253.273578] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2253.273578] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Faults: ['InvalidArgument'] [ 2253.273578] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] [ 2253.273578] env[60788]: INFO nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Terminating instance [ 2253.274760] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2253.274970] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2253.275229] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ac3e1dd-95ce-4d60-9ae6-9de0b8013ac1 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.277622] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2253.277812] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2253.278552] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ba046c-71e7-4be3-93a0-733b19e770b3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.285188] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2253.285436] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-082cf9e3-fa91-472c-b214-3f194a067c23 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.287846] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2253.288033] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2253.289202] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e16307c9-8026-41cd-a112-8b00f8f925a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.293837] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){ [ 2253.293837] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5209a114-96a7-f626-ed39-cfd776400375" [ 2253.293837] env[60788]: _type = "Task" [ 2253.293837] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2253.301391] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5209a114-96a7-f626-ed39-cfd776400375, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2253.353992] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2253.354246] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2253.354405] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleting the datastore file [datastore2] db89c7e8-6d81-4c0a-9111-9f6256588967 {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2253.354670] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ed1fc9e2-1c54-46a8-943b-d053abc798c7 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.360743] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for the task: (returnval){ [ 2253.360743] env[60788]: value = "task-2205318" [ 2253.360743] env[60788]: _type = "Task" [ 2253.360743] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2253.368150] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205318, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2253.804170] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2253.804542] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating directory with path [datastore2] vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2253.804650] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-481d3528-f1f4-404b-9155-8a83e5e446fb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.817100] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Created directory with path [datastore2] vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2253.817291] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Fetch image to [datastore2] vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2253.817465] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2253.818175] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1da254b-2b7e-4a62-b5ff-a1dca571177f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.824361] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5e1c95a-76ab-4599-ab52-d443ccc66b3c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.834112] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c2ba382-346e-4c33-9ed9-440bde3bd4d9 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.866463] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc7a6d3f-154b-4bbc-b4fa-f10e93d6cc0f 
{{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.873366] env[60788]: DEBUG oslo_vmware.api [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Task: {'id': task-2205318, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078525} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2253.874783] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2253.874977] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2253.875166] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2253.875464] env[60788]: INFO nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Took 0.60 seconds to destroy the instance on the hypervisor. 
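Both task waits in this stretch (CopyVirtualDisk_Task earlier, DeleteDatastoreFile_Task just now) follow the same poll loop: read the task state, return on success, translate the fault on error. The earlier "Fault InvalidArgument not matched" entry is the fault-class lookup falling back to the generic VimFaultException. A minimal sketch of that pattern under assumed names; the real loop lives in oslo_vmware/api.py:

```python
import time

class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list   # e.g. ['InvalidArgument']

_FAULT_CLASSES = {}  # specific fault name -> exception class; empty here

def translate_fault(name, message):
    cls = _FAULT_CLASSES.get(name)
    if cls is None:
        # "Fault InvalidArgument not matched": fall back to the generic type.
        return VimFaultException([name], message)
    return cls(message)

def wait_for_task(get_task_info, interval=0.5):
    while True:
        info = get_task_info()          # one PropertyCollector round-trip
        if info['state'] == 'success':
            return info                 # carries e.g. duration_secs
        if info['state'] == 'error':
            raise translate_fault(info['fault'], info['message'])
        time.sleep(interval)            # source of the "progress is 0%." lines
```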
[ 2253.877098] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-714d1af7-3cbe-4f70-9e7e-8f2f3cadadcf {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.878930] env[60788]: DEBUG nova.compute.claims [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2253.879119] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2253.879337] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2253.901677] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2253.957012] env[60788]: DEBUG oslo_vmware.rw_handles [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2254.018068] env[60788]: DEBUG oslo_vmware.rw_handles [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2254.018307] env[60788]: DEBUG oslo_vmware.rw_handles [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
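The image fetch above streams the bytes straight into the datastore over a single HTTP PUT, announcing the size (21318656 bytes here) up front and only reading the server's response when the handle is closed, which is exactly where the earlier RemoteDisconnected warning surfaced. A standard-library sketch of that write-handle shape; the URL handling is simplified and the function name is an assumption:

```python
import http.client
import urllib.parse

def write_image_to_datastore(url, chunks, size):
    parts = urllib.parse.urlsplit(url)
    conn = http.client.HTTPSConnection(parts.netloc)
    conn.putrequest('PUT', parts.path + '?' + parts.query)
    conn.putheader('Content-Length', str(size))
    conn.endheaders()
    for chunk in chunks:            # "Completed reading data from the image
        conn.send(chunk)            #  iterator." once this loop finishes
    try:
        resp = conn.getresponse()   # deferred until close; a server hanging
        return resp.status          # up here raises RemoteDisconnected
    finally:
        conn.close()
```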
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2254.085278] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d036c8dc-9e86-4b71-bfae-2a670ab40768 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.093146] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cc4ad30-3b7d-40ae-a845-8d87d3545a03 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.124901] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ca37cc4-a304-4bc2-8103-bef212d7eba5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.132043] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b64e177-d122-4bd3-a0a8-2016c16d72a4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2254.145412] env[60788]: DEBUG nova.compute.provider_tree [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2254.155801] env[60788]: DEBUG nova.scheduler.client.report [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2254.169154] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.290s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2254.169699] env[60788]: ERROR nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2254.169699] env[60788]: Faults: ['InvalidArgument'] [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Traceback (most recent call last): [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] 
self.driver.spawn(context, instance, image_meta, [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self._fetch_image_if_missing(context, vi) [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] image_cache(vi, tmp_image_ds_loc) [ 2254.169699] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] vm_util.copy_virtual_disk( [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] session._wait_for_task(vmdk_copy_task) [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] return self.wait_for_task(task_ref) [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] return evt.wait() [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] result = hub.switch() [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] return self.greenlet.switch() [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2254.170054] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] self.f(*self.args, **self.kw) [ 2254.170397] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 2254.170397] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967]     raise exceptions.translate_fault(task_info.error)
[ 2254.170397] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2254.170397] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Faults: ['InvalidArgument']
[ 2254.170397] env[60788]: ERROR nova.compute.manager [instance: db89c7e8-6d81-4c0a-9111-9f6256588967]
[ 2254.170534] env[60788]: DEBUG nova.compute.utils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2254.171847] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Build of instance db89c7e8-6d81-4c0a-9111-9f6256588967 was re-scheduled: A specified parameter was not correct: fileType
[ 2254.171847] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}}
[ 2254.172244] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}}
[ 2254.172419] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}}
[ 2254.172591] env[60788]: DEBUG nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 2254.172753] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2254.474198] env[60788]: DEBUG nova.network.neutron [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2254.483685] env[60788]: INFO nova.compute.manager [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Took 0.31 seconds to deallocate network for instance.
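Editor's note: a minimal sketch (not Nova's actual code) of how the task failure above surfaces to a caller. It assumes oslo.vmware is installed and that `session` is a live oslo_vmware.api.VMwareAPISession; `copy_spec` is a hypothetical dict of CopyVirtualDisk_Task arguments.

    from oslo_vmware import exceptions as vexc

    def copy_virtual_disk(session, copy_spec):
        vim = session.vim
        # Start the CopyVirtualDisk_Task server-side...
        task = session.invoke_api(vim, 'CopyVirtualDisk_Task',
                                  vim.service_content.virtualDiskManager,
                                  **copy_spec)
        try:
            # ...and block until it finishes; a task-level error is turned
            # into a VimFaultException by oslo.vmware's _poll_task, exactly
            # as in the traceback above.
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # e.fault_list carries fault names such as 'InvalidArgument';
            # Nova logs the failure and re-schedules the build.
            print('spawn failed: %s (faults: %s)' % (e, e.fault_list))
            raise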
[ 2254.579513] env[60788]: INFO nova.scheduler.client.report [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Deleted allocations for instance db89c7e8-6d81-4c0a-9111-9f6256588967
[ 2254.599543] env[60788]: DEBUG oslo_concurrency.lockutils [None req-28ef2b44-62b6-4673-97b0-0f6ced84983c tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 535.259s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2254.599798] env[60788]: DEBUG oslo_concurrency.lockutils [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 338.843s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2254.600024] env[60788]: DEBUG oslo_concurrency.lockutils [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Acquiring lock "db89c7e8-6d81-4c0a-9111-9f6256588967-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2254.600234] env[60788]: DEBUG oslo_concurrency.lockutils [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2254.600406] env[60788]: DEBUG oslo_concurrency.lockutils [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2254.602250] env[60788]: INFO nova.compute.manager [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Terminating instance
[ 2254.603918] env[60788]: DEBUG nova.compute.manager [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 2254.604140] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2254.604616] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2a39592a-fb88-4a22-9c0d-70cbbc36e782 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2254.613375] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c67814d5-3ace-40f0-9108-60a4bf2d917a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2254.639248] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance db89c7e8-6d81-4c0a-9111-9f6256588967 could not be found.
[ 2254.639455] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2254.639631] env[60788]: INFO nova.compute.manager [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 2254.639865] env[60788]: DEBUG oslo.service.loopingcall [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2254.640307] env[60788]: DEBUG nova.compute.manager [-] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}}
[ 2254.640412] env[60788]: DEBUG nova.network.neutron [-] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2254.660049] env[60788]: DEBUG nova.network.neutron [-] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2254.667604] env[60788]: INFO nova.compute.manager [-] [instance: db89c7e8-6d81-4c0a-9111-9f6256588967] Took 0.03 seconds to deallocate network for instance.
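Editor's note: a small, self-contained illustration of the oslo.concurrency pattern in the lock lines above. Build and terminate serialize on a lock named after the instance UUID, which is why do_terminate_instance waited 338.843s for the build to release it; the function names here are hypothetical stand-ins, not Nova's code.

    from oslo_concurrency import lockutils

    UUID = 'db89c7e8-6d81-4c0a-9111-9f6256588967'

    def locked_do_build_and_run_instance():
        with lockutils.lock(UUID):
            pass  # long-running build; held 535.259s in the log above

    def do_terminate_instance():
        with lockutils.lock(UUID):            # blocks until the build releases it
            with lockutils.lock(UUID + '-events'):
                pass                          # clear pending events, then tear down

    locked_do_build_and_run_instance()
    do_terminate_instance()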
[ 2254.753269] env[60788]: DEBUG oslo_concurrency.lockutils [None req-45cbd069-cd04-4ca0-8235-04d84f95d2a8 tempest-ImagesTestJSON-807282581 tempest-ImagesTestJSON-807282581-project-member] Lock "db89c7e8-6d81-4c0a-9111-9f6256588967" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.153s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2258.833564] env[60788]: DEBUG oslo_concurrency.lockutils [None req-3a6c2235-b520-4ce9-abd5-bbf30b89678d tempest-AttachInterfacesTestJSON-1830527930 tempest-AttachInterfacesTestJSON-1830527930-project-member] Acquiring lock "5aad5755-1a12-45a1-b30c-9a407992ad62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2275.731154] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2275.731620] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Getting list of instances from cluster (obj){
[ 2275.731620] env[60788]: value = "domain-c8"
[ 2275.731620] env[60788]: _type = "ClusterComputeResource"
[ 2275.731620] env[60788]: } {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 2275.732640] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddec0b7e-4964-4289-9318-8e196a877dc5 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2275.747649] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Got total of 7 instances {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 2282.741710] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2282.760079] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Getting list of instances from cluster (obj){
[ 2282.760079] env[60788]: value = "domain-c8"
[ 2282.760079] env[60788]: _type = "ClusterComputeResource"
[ 2282.760079] env[60788]: } {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 2282.761463] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22c6ed58-79b7-45af-9db8-162bb0f14035 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2282.776607] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Got total of 7 instances {{(pid=60788) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 2282.776773] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid dbf41f65-ac34-4da6-837d-9d4e924fcf7c {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.776962] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 6df14da6-6e82-4573-8dc3-27f8349e586f {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.777141] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid c4697916-5d18-4d2b-9e12-91801de44580 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.777301] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 5aad5755-1a12-45a1-b30c-9a407992ad62 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.777454] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 432ef65e-4072-44d3-81c5-9371aacbb1c2 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.777600] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 66228017-46bd-4709-8771-6a1947f8a643 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.777749] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Triggering sync for uuid 2324cad4-a7e4-429a-8503-24e6b6f90033 {{(pid=60788) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10322}}
[ 2282.778057] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2282.778289] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "6df14da6-6e82-4573-8dc3-27f8349e586f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2282.778486] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "c4697916-5d18-4d2b-9e12-91801de44580" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2282.778675] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "5aad5755-1a12-45a1-b30c-9a407992ad62" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2282.778863] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "432ef65e-4072-44d3-81c5-9371aacbb1c2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
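Editor's note: the "Running periodic task ComputeManager._*" lines above come from oslo.service's periodic-task machinery. A minimal sketch of that pattern follows, assuming oslo.service and oslo.config are installed; the Manager class and the 60-second spacing are illustrative, not Nova's real values.

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _sync_power_states(self, context):
            # Compare driver power state with the DB for each instance,
            # taking a per-instance lock as in the log above.
            pass

        @periodic_task.periodic_task(spacing=60)
        def _cleanup_running_deleted_instances(self, context):
            pass

    mgr = Manager(cfg.CONF)
    mgr.run_periodic_tasks(context=None)  # the service loop calls this repeatedly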
"66228017-46bd-4709-8771-6a1947f8a643" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2282.779252] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "2324cad4-a7e4-429a-8503-24e6b6f90033" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2290.753953] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2291.757230] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2294.754706] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2294.755035] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2294.767442] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2294.767676] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2294.767840] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2294.767997] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60788) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2294.769463] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d6eb9bf-988a-498c-8a6e-bb1699b0d592 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2294.778118] env[60788]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f95cc4d-75d9-49d6-9e48-214034b2eeda {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2294.791984] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd682aa3-345e-4645-884c-6c2ca6ef4d95 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2294.798106] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc3bcb3f-54c5-4842-bc6e-0726c64fbfd2 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2294.826617] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181250MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=60788) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2294.826751] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2294.826934] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2294.965065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2294.965065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 6df14da6-6e82-4573-8dc3-27f8349e586f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2294.965065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance c4697916-5d18-4d2b-9e12-91801de44580 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2294.965065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5aad5755-1a12-45a1-b30c-9a407992ad62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
[ 2294.965065] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 5aad5755-1a12-45a1-b30c-9a407992ad62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2294.965348] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 432ef65e-4072-44d3-81c5-9371aacbb1c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2294.965348] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 66228017-46bd-4709-8771-6a1947f8a643 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2294.965348] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Instance 2324cad4-a7e4-429a-8503-24e6b6f90033 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60788) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2294.965556] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2294.965716] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60788) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2294.981532] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing inventories for resource provider 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 2294.994355] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating ProviderTree inventory for provider 75623588-d529-4955-b0d7-8c3260d605e7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
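Editor's note: a worked check of the resource-tracker numbers above, using only values that appear in the log; pure Python, runnable as-is.

    instances = 7                     # "Got total of 7 instances"
    per_instance = {'MEMORY_MB': 128, 'DISK_GB': 1, 'VCPU': 1}
    reserved_ram_mb = 512             # 'reserved' in the MEMORY_MB inventory

    used_ram = reserved_ram_mb + instances * per_instance['MEMORY_MB']
    used_disk = instances * per_instance['DISK_GB']
    used_vcpus = instances * per_instance['VCPU']
    # Matches "Final resource view: ... used_ram=1408MB ... used_disk=7GB ... used_vcpus=7"
    assert (used_ram, used_disk, used_vcpus) == (1408, 7, 7)

    # Schedulable capacity in placement is total * allocation_ratio:
    assert 48 * 4.0 == 192.0          # per the VCPU inventory above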
[ 2294.994548] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Updating inventory in ProviderTree for provider 75623588-d529-4955-b0d7-8c3260d605e7 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 2295.004767] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing aggregate associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, aggregates: None {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 2295.020804] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Refreshing trait associations for resource provider 75623588-d529-4955-b0d7-8c3260d605e7, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60788) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 2295.098651] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-123cf40c-6cdb-4788-9d28-c2872047f083 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2295.106130] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d179a081-2cf3-417b-899f-757567876f5e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2295.136559] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1315cf9-07a4-4898-a8e8-8387ff5ceee3 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2295.142966] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5089bb11-dbc5-43ce-bb71-833ced899183 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2295.155971] env[60788]: DEBUG nova.compute.provider_tree [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2295.164483] env[60788]: DEBUG nova.scheduler.client.report [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2295.177695] env[60788]: DEBUG nova.compute.resource_tracker [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60788) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2295.177874] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
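Editor's note: a simplified stand-in for the comparison behind the "Inventory has not changed" lines above: the report client only pushes inventory to placement when the freshly computed records differ from the cached ones. The records below copy the VCPU inventory from the log; the function name is illustrative.

    cached = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                       'step_size': 1, 'allocation_ratio': 4.0}}
    proposed = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                         'step_size': 1, 'allocation_ratio': 4.0}}

    def inventory_changed(cached, proposed):
        return cached != proposed   # plain dict comparison is enough here

    assert not inventory_changed(cached, proposed)  # -> skip the placement update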
[ 2296.176835] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2296.177199] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Starting heal instance info cache {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9912}}
[ 2296.177199] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Rebuilding the list of instances to heal {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9916}}
[ 2296.195949] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196095] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196230] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: c4697916-5d18-4d2b-9e12-91801de44580] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196360] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 5aad5755-1a12-45a1-b30c-9a407992ad62] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196480] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 432ef65e-4072-44d3-81c5-9371aacbb1c2] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196599] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 66228017-46bd-4709-8771-6a1947f8a643] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196722] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: 2324cad4-a7e4-429a-8503-24e6b6f90033] Skipping network cache update for instance because it is Building. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9925}}
[ 2296.196876] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Didn't find any instances for network info cache update. {{(pid=60788) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9998}}
[ 2296.197329] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2296.197504] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2296.754024] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2298.754109] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2298.754601] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60788) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10531}}
[ 2299.749043] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2301.755523] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2301.755874] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances with incomplete migration {{(pid=60788) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11237}}
[ 2302.762867] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
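Editor's note: the "CONF.reclaim_instance_interval <= 0, skipping..." line above is a plain config gate. A minimal sketch with oslo.config follows; the default of 0 matches the skip seen here, but the function body is illustrative.

    from oslo_config import cfg

    CONF = cfg.CONF
    CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])

    def reclaim_queued_deletes():
        if CONF.reclaim_instance_interval <= 0:
            print('CONF.reclaim_instance_interval <= 0, skipping...')
            return
        # otherwise: really delete SOFT_DELETED instances older than the interval

    reclaim_queued_deletes()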
[ 2303.315647] env[60788]: WARNING oslo_vmware.rw_handles [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles     response.begin()
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2303.315647] env[60788]: ERROR oslo_vmware.rw_handles
[ 2303.315647] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Downloaded image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2303.317942] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Caching image {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2303.318213] env[60788]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Copying Virtual Disk [datastore2] vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk to [datastore2] vmware_temp/7ade94a8-d9cf-44a7-a80a-8cc36c4f124c/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk {{(pid=60788) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2303.318496] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9069b271-7d8e-49e5-89b9-aed6d2df893b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2303.327365] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){
[ 2303.327365] env[60788]: value = "task-2205319"
[ 2303.327365] env[60788]: _type = "Task"
[ 2303.327365] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2303.335773] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': task-2205319, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
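Editor's note: the RemoteDisconnected warning above comes from rw_handles.close() reading the server's final response after the upload. A sketch of that failure mode with only the standard library; `conn` is assumed to be an http.client connection that has already sent its request body.

    import http.client

    def close_write_handle(conn):
        try:
            resp = conn.getresponse()   # the close() path in rw_handles.py does this
            resp.read()
        except http.client.RemoteDisconnected:
            # Server hung up without a response; the data was already written,
            # so oslo.vmware only logs a warning and carries on.
            pass
        finally:
            conn.close()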
[ 2303.836962] env[60788]: DEBUG oslo_vmware.exceptions [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Fault InvalidArgument not matched. {{(pid=60788) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2303.837246] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2303.837864] env[60788]: ERROR nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2303.837864] env[60788]: Faults: ['InvalidArgument']
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Traceback (most recent call last):
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/compute/manager.py", line 2869, in _build_resources
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     yield resources
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     self.driver.spawn(context, instance, image_meta,
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     self._fetch_image_if_missing(context, vi)
[ 2303.837864] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     image_cache(vi, tmp_image_ds_loc)
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     vm_util.copy_virtual_disk(
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     session._wait_for_task(vmdk_copy_task)
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     return self.wait_for_task(task_ref)
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     return evt.wait()
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     result = hub.switch()
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2303.838174] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     return self.greenlet.switch()
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     self.f(*self.args, **self.kw)
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]     raise exceptions.translate_fault(task_info.error)
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Faults: ['InvalidArgument']
[ 2303.838474] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c]
[ 2303.838474] env[60788]: INFO nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Terminating instance
[ 2303.839806] env[60788]: DEBUG oslo_concurrency.lockutils [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/1d9d6f6c-1335-48c8-9690-b6c8e781cb21.vmdk" {{(pid=60788) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2303.840021] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2303.840268] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4620c6df-ef1d-4061-9fac-ee2602d01eed {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2303.842415] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}}
[ 2303.842610] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2303.843346] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd999b41-a969-4e3b-aab7-5f15ff9211bc {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2303.849740] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Unregistering the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2303.849948] env[60788]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-59db842d-319d-44b5-953d-c6118d2766ae {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2303.852056] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2303.852231] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60788) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2303.853187] env[60788]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85b4a7bb-65be-42de-a8c4-6d72ba08a3bb {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2303.857614] env[60788]: DEBUG oslo_vmware.api [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Waiting for the task: (returnval){
[ 2303.857614] env[60788]: value = "session[524dea58-2fef-0771-5b3e-5e0329fde636]5244f753-e412-502b-16cc-7a3bf03a7364"
[ 2303.857614] env[60788]: _type = "Task"
[ 2303.857614] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2303.870134] env[60788]: DEBUG oslo_vmware.api [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Task: {'id': session[524dea58-2fef-0771-5b3e-5e0329fde636]5244f753-e412-502b-16cc-7a3bf03a7364, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2303.914257] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Unregistered the VM {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2303.914490] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Deleting contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2303.914634] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Deleting the datastore file [datastore2] dbf41f65-ac34-4da6-837d-9d4e924fcf7c {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2303.914898] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ae08caf6-c3c3-4478-858e-83ab6bb93ba4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2303.921126] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for the task: (returnval){
[ 2303.921126] env[60788]: value = "task-2205321"
[ 2303.921126] env[60788]: _type = "Task"
[ 2303.921126] env[60788]: } to complete. {{(pid=60788) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
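Editor's note: a simplified stand-in for the poll loop behind the "Waiting for the task ... progress is 0%." lines above. oslo.vmware re-reads the task's info on a fixed interval and raises a translated fault on error; `fetch_info` here is a hypothetical callable returning task state, and the demo at the bottom drives it with canned values.

    import time

    def wait_for_task(fetch_info, poll_interval=0.5):
        while True:
            info = fetch_info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise RuntimeError(info['error'])  # translate_fault() in the real code
            time.sleep(poll_interval)              # emits the 'progress is N%' debug lines

    states = iter([{'state': 'running'}, {'state': 'success'}])
    print(wait_for_task(lambda: next(states), poll_interval=0.01))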
{{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2304.368084] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Preparing fetch location {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2304.368356] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating directory with path [datastore2] vmware_temp/89de91dd-19e3-47b1-81a7-459c63b8fa66/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2304.368588] env[60788]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-63572ab2-0db2-4680-9b97-038e967bd49c {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.379608] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Created directory with path [datastore2] vmware_temp/89de91dd-19e3-47b1-81a7-459c63b8fa66/1d9d6f6c-1335-48c8-9690-b6c8e781cb21 {{(pid=60788) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2304.379782] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Fetch image to [datastore2] vmware_temp/89de91dd-19e3-47b1-81a7-459c63b8fa66/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk {{(pid=60788) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2304.379947] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to [datastore2] vmware_temp/89de91dd-19e3-47b1-81a7-459c63b8fa66/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk on the data store datastore2 {{(pid=60788) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2304.380636] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fea4c7f4-e562-41b4-b961-54188ede2dc1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.386687] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77303c2a-e225-4e02-aa82-14bdf3cb683e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.395626] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d16c882a-219f-43b1-8c67-b819f98d4fd4 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.427812] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b72f6ab-7eb7-4d60-945a-eeef3cea4980 {{(pid=60788) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.434314] env[60788]: DEBUG oslo_vmware.api [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Task: {'id': task-2205321, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075146} completed successfully. {{(pid=60788) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2304.435645] env[60788]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Deleted the datastore file {{(pid=60788) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2304.435836] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Deleted contents of the VM from datastore datastore2 {{(pid=60788) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2304.436017] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2304.436197] env[60788]: INFO nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Took 0.59 seconds to destroy the instance on the hypervisor. 
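The DeleteDatastoreFile_Task entries above show the standard oslo.vmware pattern: a vSphere task is started through the API session, and the caller blocks in wait_for_task, whose poll loop produces the "progress is 0%" and "completed successfully" lines. Below is a minimal sketch of that pattern, not Nova's actual code path; the vCenter host, credentials, poll settings, and the Datacenter moref are placeholder assumptions.

# Minimal sketch of the oslo.vmware task-invocation pattern seen above.
# Host, credentials, and the Datacenter moref are assumed placeholders.
from oslo_vmware import api
from oslo_vmware import vim_util

session = api.VMwareAPISession(
    'vc1.example.test',        # assumed vCenter host
    'admin', 'secret',         # assumed credentials
    api_retry_count=10,
    task_poll_interval=0.5)    # wait_for_task polls at this interval

# FileManager.DeleteDatastoreFile_Task, as invoked in the log.
file_manager = session.vim.service_content.fileManager
datacenter = vim_util.get_moref('datacenter-2', 'Datacenter')  # assumed moref
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] dbf41f65-ac34-4da6-837d-9d4e924fcf7c',
    datacenter=datacenter)

# Blocks until the task reaches 'success'; on 'error' it raises a
# translated fault, e.g. the VimFaultException ('InvalidArgument')
# that appears further down in this log.
session.wait_for_task(task)

The same invoke-then-wait shape applies to the CopyVirtualDisk_Task that fails in the traceback below: the fault is raised out of the poll loop in _poll_task, not at invocation time.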
[ 2304.437880] env[60788]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4f695391-f3ea-48db-9f12-af6462af064a {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.439655] env[60788]: DEBUG nova.compute.claims [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Aborting claim: {{(pid=60788) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2304.439821] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2304.440039] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2304.461147] env[60788]: DEBUG nova.virt.vmwareapi.images [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] [instance: 6df14da6-6e82-4573-8dc3-27f8349e586f] Downloading image file data 1d9d6f6c-1335-48c8-9690-b6c8e781cb21 to the data store datastore2 {{(pid=60788) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2304.511363] env[60788]: DEBUG oslo_vmware.rw_handles [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/89de91dd-19e3-47b1-81a7-459c63b8fa66/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60788) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2304.571458] env[60788]: DEBUG oslo_vmware.rw_handles [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Completed reading data from the image iterator. {{(pid=60788) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2304.571607] env[60788]: DEBUG oslo_vmware.rw_handles [None req-e1b0c526-4186-4301-90a7-e35cf889a84e tempest-ServersTestJSON-960083798 tempest-ServersTestJSON-960083798-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/89de91dd-19e3-47b1-81a7-459c63b8fa66/1d9d6f6c-1335-48c8-9690-b6c8e781cb21/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60788) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2304.631116] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b91f05bc-d075-428e-a593-7f5a05279e0b {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.638874] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e763367-fe9b-40af-abc6-b419c2cdee5e {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.669758] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1493449-84bd-4116-9756-27ef1379d645 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.676920] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-647c5a57-1032-4e4c-a1df-6a00a0402918 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2304.689812] env[60788]: DEBUG nova.compute.provider_tree [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed in ProviderTree for provider: 75623588-d529-4955-b0d7-8c3260d605e7 {{(pid=60788) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2304.698096] env[60788]: DEBUG nova.scheduler.client.report [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Inventory has not changed for provider 75623588-d529-4955-b0d7-8c3260d605e7 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60788) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2304.711010] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.271s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2304.711656] env[60788]: ERROR nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2304.711656] env[60788]: Faults: ['InvalidArgument'] [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Traceback (most recent call last): [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/compute/manager.py", line 2616, in _build_and_run_instance [ 2304.711656] env[60788]: ERROR 
nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] self.driver.spawn(context, instance, image_meta, [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] self._fetch_image_if_missing(context, vi) [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] image_cache(vi, tmp_image_ds_loc) [ 2304.711656] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] vm_util.copy_virtual_disk( [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] session._wait_for_task(vmdk_copy_task) [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] return self.wait_for_task(task_ref) [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] return evt.wait() [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] result = hub.switch() [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] return self.greenlet.switch() [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2304.711987] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] self.f(*self.args, **self.kw) [ 2304.712332] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2304.712332] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] raise exceptions.translate_fault(task_info.error) [ 2304.712332] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2304.712332] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Faults: ['InvalidArgument'] [ 2304.712332] env[60788]: ERROR nova.compute.manager [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] [ 2304.712332] env[60788]: DEBUG nova.compute.utils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] VimFaultException {{(pid=60788) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2304.713582] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Build of instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c was re-scheduled: A specified parameter was not correct: fileType [ 2304.713582] env[60788]: Faults: ['InvalidArgument'] {{(pid=60788) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2455}} [ 2304.713942] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Unplugging VIFs for instance {{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2981}} [ 2304.714126] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60788) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3004}} [ 2304.714316] env[60788]: DEBUG nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2304.714544] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2305.142611] env[60788]: DEBUG nova.network.neutron [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2305.153196] env[60788]: INFO nova.compute.manager [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Took 0.44 seconds to deallocate network for instance. [ 2305.243161] env[60788]: INFO nova.scheduler.client.report [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Deleted allocations for instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c [ 2305.262395] env[60788]: DEBUG oslo_concurrency.lockutils [None req-cf4999f7-412d-450b-95a2-164129390660 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 532.099s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2305.262668] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 335.651s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2305.262892] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Acquiring lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2305.263110] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2305.263316] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2305.265397] env[60788]: INFO nova.compute.manager [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Terminating instance [ 2305.267285] env[60788]: DEBUG nova.compute.manager [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Start destroying the instance on the hypervisor. {{(pid=60788) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3125}} [ 2305.267481] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Destroying instance {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2305.267963] env[60788]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3224d391-0bab-4969-9410-492c6a3c2ec1 {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2305.278350] env[60788]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-212097e1-81aa-43dc-aee0-be4c5fb07b7f {{(pid=60788) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2305.305837] env[60788]: WARNING nova.virt.vmwareapi.vmops [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dbf41f65-ac34-4da6-837d-9d4e924fcf7c could not be found. [ 2305.306050] env[60788]: DEBUG nova.virt.vmwareapi.vmops [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Instance destroyed {{(pid=60788) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2305.306244] env[60788]: INFO nova.compute.manager [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2305.306495] env[60788]: DEBUG oslo.service.loopingcall [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60788) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2305.306975] env[60788]: DEBUG nova.compute.manager [-] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Deallocating network for instance {{(pid=60788) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} [ 2305.307102] env[60788]: DEBUG nova.network.neutron [-] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] deallocate_for_instance() {{(pid=60788) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2305.334407] env[60788]: DEBUG nova.network.neutron [-] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Updating instance_info_cache with network_info: [] {{(pid=60788) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2305.343020] env[60788]: INFO nova.compute.manager [-] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] Took 0.04 seconds to deallocate network for instance. [ 2305.440206] env[60788]: DEBUG oslo_concurrency.lockutils [None req-f51162cf-b81e-4c82-8e11-5210f255cf49 tempest-DeleteServersTestJSON-1447308408 tempest-DeleteServersTestJSON-1447308408-project-member] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.177s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2305.441036] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 22.663s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2305.441238] env[60788]: INFO nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] [instance: dbf41f65-ac34-4da6-837d-9d4e924fcf7c] During sync_power_state the instance has a pending task (deleting). Skip. 
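The "Acquiring lock ... / acquired ... waited / released ... held" lines throughout this section come from oslo.concurrency's lockutils wrapper, which times how long each caller waited for and held a named semaphore. A minimal sketch of that pattern follows, with the lock names taken from the log and the function bodies as placeholders; it is not the real ResourceTracker or ComputeManager code.

# Minimal sketch of the oslo.concurrency locking pattern behind the
# lock trace lines above; bodies are placeholders.
from oslo_concurrency import lockutils

INSTANCE_UUID = 'dbf41f65-ac34-4da6-837d-9d4e924fcf7c'

@lockutils.synchronized('compute_resources')
def abort_instance_claim():
    """Placeholder for ResourceTracker.abort_instance_claim; the
    'waited'/'held' debug lines are emitted by the synchronized
    wrapper ('inner' in lockutils.py)."""
    pass

def do_terminate_instance():
    # Serializes all operations on one instance, like the per-UUID
    # lock in the log; body is a placeholder.
    with lockutils.lock(INSTANCE_UUID):
        abort_instance_claim()

do_terminate_instance()

Because every worker takes the same per-UUID lock, the terminate request above waited 335.651s for the lock that the failed build held for 532.099s, which is exactly what the two lockutils lines report.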
[ 2305.441430] env[60788]: DEBUG oslo_concurrency.lockutils [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Lock "dbf41f65-ac34-4da6-837d-9d4e924fcf7c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2306.753859] env[60788]: DEBUG oslo_service.periodic_task [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60788) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2306.754282] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] Cleaning up deleted instances {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11199}} [ 2306.764877] env[60788]: DEBUG nova.compute.manager [None req-538cdfc0-7778-46a4-8aa9-a7911daf8060 None None] There are 0 instances to clean {{(pid=60788) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11208}} [ 2315.528631] env[60788]: DEBUG oslo_concurrency.lockutils [None req-c777e961-068c-447a-ac5f-d135cf84fa5c tempest-ListImageFiltersTestJSON-1385008548 tempest-ListImageFiltersTestJSON-1385008548-project-member] Acquiring lock "66228017-46bd-4709-8771-6a1947f8a643" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60788) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
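The ComputeManager._run_pending_deletes entry above is driven by oslo.service's periodic task machinery: methods decorated as periodic tasks are collected on a manager class and dispatched by the service loop. A minimal sketch of that mechanism follows; the spacing value, run_immediately flag, and the cleanup body are assumptions for illustration, not Nova's configuration.

# Minimal sketch of the oslo.service periodic-task mechanism that runs
# ComputeManager._run_pending_deletes; interval and body are assumed.
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=600, run_immediately=True)
    def _run_pending_deletes(self, context):
        # Placeholder for the real cleanup of soft-deleted instances;
        # the log's "There are 0 instances to clean" comes from the
        # actual implementation in nova/compute/manager.py.
        print('Cleaning up deleted instances')

    def periodic_tasks(self, context):
        # Invoked by the service loop; runs whichever tasks are due
        # and returns how long the loop may idle before the next one.
        return self.run_periodic_tasks(context, raise_on_error=False)

mgr = Manager()
mgr.periodic_tasks(context=None)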